OSCeleton for Kinect

Start by setting up OpenNI, NITE, and SensorKinect. Follow the steps here: http://tohmjudson.com/?p=30

Below is a simplified version of skeleton tracking. It draws each joint of the skeleton as a simple ellipse.

/* Simple Processing skeleton sketch for Kinect.
Instead of using ArrayLists, this example uses PVectors and ellipses.
It works for only one user.
Mustafa Bagdatli, http://www.mustafabagdatli.com
February 28, 2011
*/
import oscP5.*;
import netP5.*;
OscP5 oscP5;
boolean userExist = false;
PVector head = new PVector(0,0,0);
PVector neck = new PVector(0,0,0);
PVector r_collar = new PVector(0,0,0);
PVector r_shoulder = new PVector(0,0,0);
PVector r_elbow = new PVector(0,0,0);
PVector r_wrist = new PVector(0,0,0);
PVector r_hand = new PVector(0,0,0);
PVector r_finger = new PVector(0,0,0);
PVector r_collar2 = new PVector(0,0,0);
PVector l_shoulder = new PVector(0,0,0);
PVector l_elbow = new PVector(0,0,0);
PVector l_wrist = new PVector(0,0,0);
PVector l_hand = new PVector(0,0,0);
PVector l_finger = new PVector(0,0,0);
PVector torso = new PVector(0,0,0);
PVector r_hip = new PVector(0,0,0);
PVector r_knee = new PVector(0,0,0);
PVector r_ankle = new PVector(0,0,0);
PVector r_foot = new PVector(0,0,0);
PVector l_hip = new PVector(0,0,0);
PVector l_knee = new PVector(0,0,0);
PVector l_ankle = new PVector(0,0,0);
PVector l_foot = new PVector(0,0,0);
void setup() {
  size(screen.height*4/3, screen.height); // 4:3 window matching the Kinect's aspect ratio
  smooth();
  oscP5 = new OscP5(this, "127.0.0.1", 7110); // listen for OSCeleton messages on port 7110
}
/* Incoming OSC messages are forwarded to the oscEvent method. */
// Here you can easily see the format of the OSC messages sent. For each user, the joints are named with
// the joint name followed by the user ID (head0, neck0 ... r_foot0; head1, neck1 ...).
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/joint") && msg.checkTypetag("sifff")) {
    // We have received joint coordinates; find out which joint it is and save the values.
    String joint = msg.get(0).stringValue();
    int id = msg.get(1).intValue(); // user ID (unused here, since this sketch tracks only one user)
    PVector pos = new PVector(msg.get(2).floatValue(), msg.get(3).floatValue(), msg.get(4).floatValue());
    if (joint.equals("head")) head = pos;
    else if (joint.equals("neck")) neck = pos;
    else if (joint.equals("r_collar")) r_collar = pos;
    else if (joint.equals("r_shoulder")) r_shoulder = pos;
    else if (joint.equals("r_elbow")) r_elbow = pos;
    else if (joint.equals("r_wrist")) r_wrist = pos;
    else if (joint.equals("r_hand")) r_hand = pos;
    else if (joint.equals("r_finger")) r_finger = pos;
    else if (joint.equals("l_collar")) r_collar2 = pos; // OSCeleton names the left collar joint l_collar
    else if (joint.equals("l_shoulder")) l_shoulder = pos;
    else if (joint.equals("l_elbow")) l_elbow = pos;
    else if (joint.equals("l_wrist")) l_wrist = pos;
    else if (joint.equals("l_hand")) l_hand = pos;
    else if (joint.equals("l_finger")) l_finger = pos;
    else if (joint.equals("torso")) torso = pos;
    else if (joint.equals("r_hip")) r_hip = pos;
    else if (joint.equals("r_knee")) r_knee = pos;
    else if (joint.equals("r_ankle")) r_ankle = pos;
    else if (joint.equals("r_foot")) r_foot = pos;
    else if (joint.equals("l_hip")) l_hip = pos;
    else if (joint.equals("l_knee")) l_knee = pos;
    else if (joint.equals("l_ankle")) l_ankle = pos;
    else if (joint.equals("l_foot")) l_foot = pos;
  }
  else if (msg.checkAddrPattern("/new_user") && msg.checkTypetag("i")) {
    // A new user is in front of the Kinect... Tell them to do the calibration pose!
    println("New user with ID = " + msg.get(0).intValue());
    userExist = true;
  }
  else if (msg.checkAddrPattern("/new_skel") && msg.checkTypetag("i")) {
    // New skeleton calibrated! Let's create it!
    int id = msg.get(0).intValue();
    println("Skeleton calibrated for user " + id);
  }
  else if (msg.checkAddrPattern("/lost_user") && msg.checkTypetag("i")) {
    // Lost user/skeleton
    userExist = false;
    int id = msg.get(0).intValue();
    println("Lost user " + id);
  }
}
void draw() {
  background(0);
  noStroke();
  println(userExist); // debug: shows whether a user is currently tracked
  if (userExist) {
    fill(255);
    // The x,y values we receive are normalized between 0 and 1.0,
    // which is why we multiply them by the width and height of the window.
    ellipse(head.x*width, head.y*height, 50/head.z, 50/head.z);
    ellipse(r_collar.x*width, r_collar.y*height, 30/r_collar.z, 30/r_collar.z);
    ellipse(r_shoulder.x*width, r_shoulder.y*height, 30/r_shoulder.z, 30/r_shoulder.z);
    ellipse(r_elbow.x*width, r_elbow.y*height, 30/r_elbow.z, 30/r_elbow.z);
    ellipse(r_wrist.x*width, r_wrist.y*height, 30/r_wrist.z, 30/r_wrist.z);
    ellipse(r_hand.x*width, r_hand.y*height, 30/r_hand.z, 30/r_hand.z);
    ellipse(r_finger.x*width, r_finger.y*height, 30/r_finger.z, 30/r_finger.z);
    ellipse(r_collar2.x*width, r_collar2.y*height, 30/r_collar2.z, 30/r_collar2.z);
    ellipse(l_shoulder.x*width, l_shoulder.y*height, 30/l_shoulder.z, 30/l_shoulder.z);
    ellipse(l_elbow.x*width, l_elbow.y*height, 30/l_elbow.z, 30/l_elbow.z);
    ellipse(l_wrist.x*width, l_wrist.y*height, 30/l_wrist.z, 30/l_wrist.z);
    ellipse(l_hand.x*width, l_hand.y*height, 30/l_hand.z, 30/l_hand.z);
    ellipse(l_finger.x*width, l_finger.y*height, 30/l_finger.z, 30/l_finger.z);
    ellipse(torso.x*width, torso.y*height, 30/torso.z, 30/torso.z);
    ellipse(r_hip.x*width, r_hip.y*height, 30/r_hip.z, 30/r_hip.z);
    ellipse(r_knee.x*width, r_knee.y*height, 30/r_knee.z, 30/r_knee.z);
    ellipse(r_ankle.x*width, r_ankle.y*height, 30/r_ankle.z, 30/r_ankle.z);
    ellipse(r_foot.x*width, r_foot.y*height, 30/r_foot.z, 30/r_foot.z);
    ellipse(l_hip.x*width, l_hip.y*height, 30/l_hip.z, 30/l_hip.z);
    ellipse(l_knee.x*width, l_knee.y*height, 30/l_knee.z, 30/l_knee.z);
    ellipse(l_ankle.x*width, l_ankle.y*height, 30/l_ankle.z, 30/l_ankle.z);
    ellipse(l_foot.x*width, l_foot.y*height, 30/l_foot.z, 30/l_foot.z);
  }
}
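The scaling logic in draw() is the same for every joint, so it helps to see it stated once. Here is the mapping restated as a small standalone C++ function, a sketch under the assumption that OSCeleton sends x and y normalized to 0..1 and z roughly in meters (the window size and base diameter below are just illustrative values):

```cpp
#include <cassert>

struct JointPixel { float x, y, d; };

// Map a normalized OSCeleton joint coordinate to pixel space.
// x and y arrive in 0..1, so multiplying by the window dimensions
// gives pixel positions; dividing a base diameter by z makes
// joints that are farther from the camera draw smaller.
JointPixel toPixels(float nx, float ny, float z,
                    int winWidth, int winHeight, float baseDiameter) {
    JointPixel p;
    p.x = nx * winWidth;    // 0..1 -> 0..winWidth
    p.y = ny * winHeight;   // 0..1 -> 0..winHeight
    p.d = baseDiameter / z; // shrink with distance
    return p;
}
```

In the sketch above this corresponds to calls like ellipse(head.x*width, head.y*height, 50/head.z, 50/head.z).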

I2C tutorial

Here is a tutorial that I prepared for Arduino I2C communication. Arduino.cc has a good tutorial on I2C; I have added my own experience on top of it, so you will recognize some material from the Arduino site. I2C lets you communicate between multiple Arduinos: one master Arduino and one or more slave Arduinos, which can be used for input or output.

To make the hardware connection, you need to connect the ground and 5V pins of the Arduinos together. For data transfer, you also need to connect the SDA and SCL pins.

I2C is a synchronous communication protocol between Arduinos, so you don't need to worry about synchronization yourself. I2C uses two pins: SDA, the data line, and SCL, the clock line that keeps the Arduinos synchronized to the master.

On most Arduino boards, SDA (data line) is on analog input pin 4, and SCL (clock line) is on analog input pin 5. On the Arduino Mega, SDA is digital pin 20 and SCL is digital pin 21. (From http://arduino.cc/en/Reference/Wire.)

Functions

Begin:

Wire.begin()

Wire.begin(address)

Description

Initiate the Wire library and join the I2C bus as a master or slave.

The address identifies each Arduino module in the conversation between Arduinos. It is really important to define an address for each Arduino, because it is the simplest way to identify them on the bus. If no address is specified, the Arduino joins the bus as a master. Slaves must be given an address.

Parameters

address: the 7-bit slave address (0-127).

Returns

None

Example

void setup(){

Wire.begin(4);                // join i2c bus with address #4

}

Wire.requestFrom(address, quantity)

Description

This function is used when the master requests bytes from a slave. You need to specify the address of the slave you are requesting bytes from, and the number of bytes you are requesting.

The bytes may then be retrieved with the available() and receive() functions.

Parameters

address: the 7-bit address of the device to request bytes from

quantity: the number of bytes to request

Returns

None

Example

for master

// by Nicholas Zambetti <http://www.zambetti.com>
#include <Wire.h>

void setup()
{
Wire.begin();        // join i2c bus (address optional for master)
Serial.begin(9600); // serial connection
}

void loop()
{
Wire.requestFrom(2, 6);    // request 6 bytes from slave device #2

while(Wire.available())    // slave may send less than requested
{
char c = Wire.receive(); // receive a byte as a character; change "char" according to what your slave sends (e.g. byte, float, etc.)
Serial.print(c);         // print the character
}

delay(500);
}

See Also

Wire.beginTransmission(address)

Description

Begin a transmission to the I2C slave device with the given address. This tells the master that you are about to start transmitting data to the specified slave. Subsequently, queue bytes for transmission with the send() function and transmit them by calling endTransmission().

Parameters

address: the 7-bit address of the device to transmit to

Returns

None

Example

for master

// by Nicholas Zambetti <http://www.zambetti.com>

#include <Wire.h>

void setup()
{
Wire.begin(); // join i2c bus (address optional for master)
}

byte x = 0; //data to transmit

void loop()
{
Wire.beginTransmission(4); // transmit to device #4
Wire.send("x is ");        // sends five bytes
Wire.send(x);              // sends one byte
Wire.endTransmission();    // stop transmitting; this call actually performs the transmission

x++;
delay(500);
}

See Also

Wire.endTransmission()

Description

Ends a transmission to a slave device that was begun by beginTransmission() and actually transmits the bytes that were queued by send().

Parameters

None

Returns

None

See Also

Wire.send(value)

Wire.send(string)

Wire.send(data, quantity)

Description

Sends data from a slave device in response to a request from a master, or queues bytes for transmission from a master to slave device (in-between calls to beginTransmission() and endTransmission()).

Parameters

value: a byte to send (byte)

string: a string to send (char *)

data: an array of data to send (byte *)

quantity: the number of bytes of data to transmit (byte)

Returns

None

See Also

Wire.available()

Description

Returns the number of bytes available for retrieval with receive(). This should be called on a master device after a call to requestFrom() or on a slave inside the onReceive() handler.

Parameters

None

Returns

The number of bytes available for reading.

Example

for master

// by Nicholas Zambetti <http://www.zambetti.com>

void loop()
{
Wire.requestFrom(2, 6);    // request 6 bytes from slave device #2

while(Wire.available())    // slave may send less than requested
{
char c = Wire.receive(); // receive a byte as character
Serial.print(c);         // print the character
}

delay(500);
}

See Also

byte Wire.receive()

Description

Retrieve a byte that was transmitted from a slave device to a master after a call to requestFrom or was transmitted from a master to a slave.

Parameters

None

Returns

The next byte received.

Example

for slave

// by Nicholas Zambetti <http://www.zambetti.com>

void loop()
{
Wire.requestFrom(2, 6);    // request 6 bytes from slave device #2

while(Wire.available())    // slave may send less than requested
{
char c = Wire.receive(); // receive a byte as character
Serial.print(c);         // print the character
}

delay(500);
}

See Also

Wire.onReceive(handler)

Description

Registers a function to be called when a slave device receives a transmission from a master.

Wire.onReceive(handler) should be called in the setup function. Whenever the slave receives data from the master, Wire.onReceive(handler) calls the registered handler function to process the incoming data.

Parameters

handler: the function to be called when the slave receives data; this should take a single int parameter (the number of bytes received from the master) and return nothing, e.g.: void myHandler(int numBytes)

Example

for slave

// by Nicholas Zambetti <http://www.zambetti.com>
#include <Wire.h>

void setup()
{
Wire.begin(4);                // join i2c bus with address #4
Wire.onReceive(receiveEvent); // register event
Serial.begin(9600);           // start serial for output
}

void loop()
{
delay(100);
}

// function that executes whenever data is received from master
// this function is registered as an event, see setup()
void receiveEvent(int howMany)
{
while(Wire.available()) // loop through all received bytes
{
char c = Wire.receive(); // receive byte as a character
Serial.print(c);         // print the character
}

}

Returns

None

See Also

Wire.onRequest(handler)

Description

Register a function to be called when a master requests data from this slave device.

Wire.onRequest(handler) should be called in the setup function. Whenever the master requests data from the slave, Wire.onRequest(handler) calls the registered handler function to respond to the request.

Parameters

handler: the function to be called, takes no parameters and returns nothing, e.g.: void myHandler()

Returns

None

Example

For slave
// by Nicholas Zambetti <http://www.zambetti.com>

#include <Wire.h>

void setup()
{
Wire.begin(2);                // join i2c bus with address #2
Wire.onRequest(requestEvent); // register event
}

void loop()
{
delay(100);
}

// function that executes whenever data is requested by master
// this function is registered as an event, see setup()
void requestEvent()
{
Wire.send("hello "); // respond with message of 6 bytes
// as expected by master
}

See Also

UIST 2010

UIST 2010 was held in New York this year. It was a really great experience for me. We had great keynote speakers and paper presentations. Also, this year's student contest had students working with some really cool LCD keyboards from Microsoft. These can easily turn into nice game controllers or assistive technology devices, as well as nice web interface controllers.

The keynote speakers of the conference were Marvin Minsky (The Interested Interface), Natalie Jeremijenko (Connected Environments), and Jaron Lanier (The Engineering of Personhood). It was really good to hear them, since their topics were almost completely different from the papers that were presented.

The rest of this post will be about some of the demos and papers that I really liked and found inspirational.

Imaginary interfaces

Imaginary Interfaces is an interface where the user draws/writes in the air, using his or her right-hand thumb and index finger as the borders of an imaginary sheet of paper. The drawings/writings are sent to the other user's screen. This is useful when you are describing something over the phone. The technology behind this application is an IR camera on the user's chest that detects the user's hand gestures.

Hands-On Math: A page-based multi-touch and pen desktop for technical work and problem solving

This paper was about an application that helps people learn math. The idea is based on the advantages of blackboards, which people gather around to discuss a formula or to draw and write. I think these elements are helpful for learning. Hands-On Math is a touch-screen interface that also works with a pen. It is designed as a math interface: the user can write formulas and scale them, touching a formula with one finger and dragging with another to change its scale. I really like the new gestures they are using for interaction.

You can check out their PowerPoint slides here.

Pen + Touch = New Tools

This is a paper from Microsoft Research. They are trying to bring pen input to the multi-touch screen experience. They don't see the pen as the base for the interaction, but as an additional input that you can use together with your fingers. The pen can be used whenever it is necessary, such as while you are reading a book on your iPad: it is great for taking notes or underlining things.

TurKit: Human Computation Algorithms on Mechanical Turk

I think this is a really interesting idea. In this paper, they explain ways to use Mechanical Turk workers in an algorithm to get the best results. They have a simple example with a handwritten note: in the first iteration no one can read it, but by changing some restrictions, they get positive results within a few iterations.

MAI Painting Brush: An Interactive Device that Realizes the Taste and Feeling of Real Painting

This is a painting device that lets users make virtual paintings in 3D space. The user wears virtual reality glasses and paints with a special brush. The user can also change the tip of the brush, which seems really helpful.

SqueezeBlock: Using Virtual Springs in Mobile Devices for Eyes-Free Interaction

This is an interface that gives the user tactile feedback by changing the strength of virtual springs. They present the device as a phone interface, but I think there are other useful ways to use it.

You can see their paper in here.

Gilded Gait: Reshaping the Urban Experience with Augmented Footsteps

This paper is about giving the user tactile feedback (vibration) through the feet. Using an accelerometer, switches, etc., it lets the user feel what the surface is like. I think this sort of feedback would also be useful for people with visual impairments: you could easily add a GPS device on top of this and help them navigate.

Jogging over a Distance between Europe and Australia

This is a really cool idea for connecting people in different locations. People usually don't like running alone, so this application tries to connect runners in real time over the phone. Depending on their relative speeds, they hear each other from behind or in front via audio panning. One downside of this project: I really don't like talking while I am running, as it really affects a runner's breathing.

FootLoose: Exploration and implementation of practical accelerometer-enabled foot gestures

I think this one was one of my favorites, and the closest one to ITP. It is an iPhone app that uses the accelerometer as an input for controlling the phone. Let's say you are carrying lots of bags and cannot answer your phone. If you are already wearing your headset, then after a few dance moves you can actually answer your phone and talk to your friends. I feel like someone could make really nice iPhone games controlled by dance movements using this idea.

Multitoe: High-Precision Interaction with Back-Projected Floors Based on High-Resolution Multi-Touch Input

This is a great idea, and it seems like they have gone through a lot with this project. I think it could be really fun and useful in many public spaces like science museums.

Combining Multiple Depth Cameras and Projectors for Interactions On, Above and Between Surfaces

I think this is the most fascinating project that I saw during UIST. It is based on three 3D cameras and three projectors in a room. By using the three different camera angles, they are able to track the users in the room, and by using the projectors, they are able to turn the whole room into a computer. It makes it really easy to create interaction in 3D space. Check out the video for the rest.

MudPad

MudPad is a nice user interface that gives users tactile feedback using localized vibration. I can see this product in many assistive technology projects or games. For more, click here.

Pinstripe: Eyes-free Continuous Input Anywhere on Interactive Clothing

Unfortunately, I couldn't see the real demo for this project, but the paper sounds really cool. It is a way to detect the area you are pinching on a garment, and they claim they can make the sensing area as large as they need. For more information, here is a link to the paper.

Principles of Visual Design Assignment

This week, for the Principles of Visual Design assignment, I am going to analyze www.hurriyet.com.tr. Hurriyet.com.tr is the website of a physical newspaper published in my home country, and it is how I keep up with daily news there.

It is always hard to design news sources for the web, because the layout needs to be flexible for news updates, and since it is online there is a lot to share with the user: page count and cost are no longer constraints. I really don't like the design of this site, yet it is a very popular page in my country. These are the reasons why I am analyzing it.

In general, the page seems really crowded with pictures, text, and advertisements. It is really hard to orient yourself while navigating the site.

I think their intent was to make a site with lots of news and pictures that users can navigate through to find what they are looking for. However, this becomes overwhelming with the current design of the site.

In general, they use two typefaces. There are some parts where they need to use different typefaces, usually logos. However, if you look at the bottom-left corner of the page, they use an unnecessary logo inside the menu boxes. This logo adds an unnecessary color and brings a new typeface to the site.

In general, these are the colors they are using. Except for the green, it looks OK. However, color seems like something they added later into the design. Other than these four, more colors come in with the pictures and advertisements.

At first, except for that weird box on the right side, the alignment looks OK. However, when you start drawing empty boxes onto the text blocks, it is not hard to see that there is no consistent alignment in the site. The only good thing about the alignment is that everything is at least aligned to the left and right, except for that box on the right side.

Hierarchically, I think they did a good job, because I usually just read the top part of the site and look through the images. If I can't find something interesting at the top, then I go down toward the bottom.

First solid prototype

I made a post about the different fabrics that I tried for my prototype. I have decided on using "stretch conductive fabric". I chose it because my first simple galvanic skin response test with the stretch conductive fabric gave good results, and its description says it is a really good conductive material, well suited for electrodes.


So, I attached two pieces of fabric to a wristband, and I used a LilyPad Arduino with a 300k resistor.
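Since the electrodes and the 300k resistor form a voltage divider, the analog reading can be converted back into an estimated skin resistance. Here is a minimal sketch of that conversion; the wiring order (skin between 5V and the analog pin, 300k from the pin to ground) and the 10-bit ADC are my assumptions, not necessarily the exact circuit used above:

```cpp
#include <cassert>

// Estimate skin resistance from a 10-bit ADC reading (0..1023),
// assuming: Vpin = Vcc * rFixed / (rFixed + rSkin), so
// rSkin = rFixed * (1023.0 / adc - 1).
float skinResistance(int adc, float rFixed) {
    if (adc <= 0) return -1.0f; // no contact / open circuit
    return rFixed * (1023.0f / adc - 1.0f);
}
```

On the LilyPad itself you would feed analogRead(pin) into this; a reading around 512 (half of Vcc) corresponds to a skin resistance close to the fixed 300k.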

And this is another image showing how my data is represented for now, and how it changes...

Here are two videos from my prototype: one using the fingers, one using the wrist. I was taking rapid breaths while experimenting to see the change in the graph. When I take rapid breaths the graph goes up; whenever I breathe all the way out, the graph goes down.