camera and sonar visualization

as the last exercise proposed for the workshop datajockey, the processing and arduino code below creates a visual representation of two different phenomena: first, processing grabs the video input coming from a camera attached to the computer and converts it into a grid of colored cells; then, these cells are modified in real time, depending on the distance of an object detected by a sonar attached to an arduino board.

here is the code for the processing side of the experiment. the video library used here is gsvideo, since it works on linux:

 

//camera-sonar module visualization
//by medul.la
//based on the example 'mirror' from the gsvideo processing library,
//and on the code found in this forum post by dvnanness:
//http://forum.processing.org/topic/multiple-sonar-reading-from-arduino-to-processing
 
import processing.serial.*;
import codeanticode.gsvideo.*;
 
// Number of columns and rows in our system
int cols, rows;
// Variable for capture device
GSCapture video;
 
Serial myPort;
int numSensors = 1;
int linefeed = 10;
int sensors[];
float read1;
int cellZFactor = 1; // start at 1 so the color math has a sane value before the first sonar reading
 
 
// Size of each cell in the grid
int cellSize = 15;
 
void setup() {
  size(640, 480, P3D);

  // set up columns and rows
  cols = width / cellSize;
  rows = height / cellSize;
  colorMode(RGB, 255, 255, 255, 100);
  rectMode(CENTER);

  // uses the default video input, see the reference if this causes an error
  video = new GSCapture(this, width, height, 30);

  // list all the available serial ports
  println(Serial.list());

  // change the 0 to the appropriate number of the serial port
  // that your microcontroller is attached to.
  myPort = new Serial(this, Serial.list()[0], 9600);
  myPort.bufferUntil(linefeed);
}
 
void draw() {
  if (sensors != null) {

    // once a valid reading has arrived, map the sonar distance
    // to a factor that will drive the cells' size and color
    read1 = map(sensors[0], 0, 600, 1, 30);
    cellZFactor = int(read1);
  }

  if (video.available()) {
    video.read();
    video.loadPixels();

    background(0);

    // begin loop for columns
    for (int i = 0; i < cols; i++) {
      // begin loop for rows
      for (int j = 0; j < rows; j++) {

        // where are we, pixel-wise?
        int x = i * cellSize;
        int y = j * cellSize;
        int loc = (video.width - x - 1) + y * video.width; // reversing x to mirror the image

        // the rects' color and size depend on the information from the sonar input
        // and on the brightness and colors captured by the camera
        color c = video.pixels[loc];
        float sz = (brightness(c) / 255.0) * cellSize + cellZFactor;
        fill(red(c) / cellZFactor, blue(c) + cellZFactor, green(c) * cellZFactor / 3);
        noStroke();
        rect(x + cellSize/2, y + cellSize/2, sz, sz);
      }
    }
  }
}
 
void serialEvent(Serial myPort) {

  // read the serial buffer:
  String myString = myPort.readStringUntil(linefeed);

  // if you got any bytes other than the linefeed:
  if (myString != null) {

    myString = trim(myString);

    // convert the trimmed string (one reading per line)
    // into an array of integers:
    sensors = int(split(myString, '\n'));

    // print out the values you got:
    for (int sensorNum = 0; sensorNum < sensors.length; sensorNum++) {
      print("Sensor " + sensorNum + ": " + sensors[sensorNum] + "\t");
    }
    // add a linefeed after all the sensor values are printed:
    println();
  }
  video.read();
}
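
a note on the serial port: the sketch above simply opens the first port that Serial.list() returns, which is not always the arduino. if no readings arrive, a small variation like the one below prints every available port and picks the first one whose name looks like an arduino on linux (/dev/ttyACM* or /dev/ttyUSB*); those names and the fallback to index 0 are assumptions, so adjust them to whatever println(Serial.list()) shows on your machine. it also prints the incoming readings, which makes it a quick way to test the serial link before adding the camera:

//minimal port-picking/test sketch (a sketch under the assumptions above, not part of the original post)
import processing.serial.*;

Serial myPort;
int linefeed = 10;

void setup() {
  String[] ports = Serial.list();
  int portIndex = 0; // fallback: first port in the list

  // print every port and prefer one that looks like a linux arduino port
  for (int i = 0; i < ports.length; i++) {
    println(i + ": " + ports[i]);
    if (ports[i].contains("ttyACM") || ports[i].contains("ttyUSB")) {
      portIndex = i;
    }
  }

  myPort = new Serial(this, ports[portIndex], 9600);
  myPort.bufferUntil(linefeed);
}

void draw() {
  // nothing to draw; serialEvent() prints the readings
}

void serialEvent(Serial p) {
  String s = p.readStringUntil(linefeed);
  if (s != null) {
    println("sonar: " + trim(s));
  }
}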

and here is the arduino code. it is written for sonar sensors similar to the HC-SR04 model.

 

// defining pins: pin 7 sends the trigger pulse, pin 8 reads the echo
const int pingPin = 7;
const int pingPin8 = 8;

void setup() {
  Serial.begin(9600);
}

void loop() {
  long duration, inches, cm;

  // send a short trigger pulse
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  // measure how long the echo pin stays HIGH
  pinMode(pingPin8, INPUT);
  duration = pulseIn(pingPin8, HIGH);

  // convert the round-trip time to distance and send it over serial
  inches = microsecondsToInches(duration);
  cm = microsecondsToCentimeters(duration);
  Serial.println(cm);

  delay(100);
}

long microsecondsToInches(long microseconds) {
  // sound travels at roughly 74 microseconds per inch;
  // divide by 2 because the pulse goes out and comes back
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds) {
  // roughly 29 microseconds per centimeter, out and back
  return microseconds / 29 / 2;
}

more details on how to hook an HC-SR04 to an arduino board here.
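
in case that link is unavailable, here is a typical hookup that matches the pin numbers used in the sketch above. this is an assumption based on the standard 4-pin HC-SR04 module, not something taken from the original post:

// assumed HC-SR04 wiring for the arduino sketch above:
//
//   HC-SR04 VCC  -> arduino 5V
//   HC-SR04 GND  -> arduino GND
//   HC-SR04 TRIG -> arduino digital pin 7 (pingPin in the code)
//   HC-SR04 ECHO -> arduino digital pin 8 (pingPin8 in the code)
//
// the module is typically rated for distances of roughly 2 cm to 400 cm.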
