
Cambria to Monterey

Cambria to Monterey: 103 miles

http://maps.google.com/maps?saddr=35.565982,-121.07833&daddr=moneterey&geocode=FZ6xHgIdxn3I-A%3BFa55LgId7Ai8-CmR-7VwUuSNgDFePUrYCUlI7g&sll=35.608185,-119.816895&sspn=1.098637,5.625&ie=UTF8&ll=36.081455,-121.50837&spn=1.03751,0.85984&t=m&output=embed


I woke up to a foggy Monday morning in Cambria, but Tim the innkeeper assured me the fog would blow away by 10am. I had coffee and some chocolate chip muffins I had packed myself, then headed to San Simeon and Hearst Castle.

california 8

california 1
The road footage from the previous day was garbage, so I made a few modifications to the camera setup and it worked great. It was a short 10 miles to Hearst Castle, so I got there in time for the very first tour of the day.

Hearst Castle and William Hearst were the inspiration for Orson Welles’ classic, Citizen Kane. Sadly I missed the evening screening of the film at Hearst Castle last Friday, but today will do.

california 4

The castle was donated by the Hearst Corporation to the State of California in 1957 and is now managed by California State Parks. The corporation still owns the surrounding land, where Black Angus cattle graze alongside zebras descended from the exotic animals Hearst once populated the area with.

california 7

The house was a continuous project of Hearst and architect Julia Morgan.

california 3

california 6

I’ll save the details about the tour for the documentary, but on with the road. It was 10am by the time I got back to the car and set it up again. Not far from San Simeon are the elephant seal colonies, where I stopped to take some pictures and video before setting off for the challenge of the day: Big Sur.

Big Sur is the most exciting, and deadliest, part of California 1; you have to check road conditions before setting off, as there could be landslides. The fog had moved inland, the roads were dry, and the sun was out, which made for an excellent day to drive up.

The road winds its way along the California coast in tight switchbacks and hairpin turns with suggested speeds of 20mph. But with the car I was driving, I could make the turns at 35-40mph.

Throughout this part of the trip, there were three spots where traffic was controlled due to construction. One thing to keep in mind on this highway is your rearview mirror: there are numerous turnouts where slower cars can let faster cars through. Sadly, many drivers don’t use them, making the trip a lot longer than it should be.

After eventually passing these vehicles, it was time to take note of the damage to the road. Orange traffic cones lined the edges of the highway, marking places where rock slides had recently destroyed sections of the road, taking the steel barriers with them. A huge bridge construction effort is underway, which held us up for at least 15 minutes before we could pass.

Just last year, the road was closed for seven months due to damage. It’s a sad reality that such a beautiful road is being reclaimed by the sea.

The 2012 base-model Mazda worked perfectly. The car’s front-wheel drive made it easy to take the corners and just have fun.

California 1 is not a place you would want to drive during the summer. The road fills with people going back and forth, along with numerous bike tourers. Most bike tours go from north to south, which is predominantly downhill; coupled with favorable tailwinds and the coast on your side, it makes for a perfect ride. Being on a bike also lets you experience the road like nothing else: you can stop in the middle of Bixby Bridge and enjoy the view, something you can’t do in a car. It’s something I’d like to do, but I’ll get into shape first before trying it. Michael Ballard over at crazyguyonabike.com has a great journal of his trip down the coast in 2010.

california 9

I hit Carmel-by-the-Sea and Monterey around lunchtime and proceeded to Pebble Beach. Pebble Beach is home to the famous 17-Mile Drive and golf course. It’s a private area, so all vehicles pay $9.50 to take in the scenery. Bikes get in for free. No motorcycles, though.

california 11

california 10

The views here are amazing as the Pacific waves crash on the jagged rocks of Pebble Beach. Pebble Beach is also home to the world’s most photographed tree: the Lone Cypress, estimated to be 250 years old. It is also the trademarked symbol of Pebble Beach, making it illegal to photograph the tree for commercial purposes. The tree is actually held in place by steel cables to keep it from falling into the ocean. At least the cables aren’t that visible.

california 13

california 17

I drove into the tourist area of Monterey and checked in for the night.

Not far from Monterey is Salinas, birthplace of John Steinbeck and home to the Laguna Seca racing circuit. But it is over in Monterey that Steinbeck made Cannery Row famous, where the fishing boats that netted the waters of Monterey Bay docked and sardines were packed, until overfishing killed the industry. Now all that remains is a tourist row and the Monterey Bay Aquarium.

california 16

I’d never been to the aquarium before, so I made it a point to visit on this trip. It’s not SeaWorld, but the collection is still amazing; the jellyfish and sea horses are just stunning. The aquarium highlights the biodiversity of the Monterey coast (which, by the way, is a protected area that stretches all the way to San Simeon).

california 18

california 20

california 21

california 22

california 23

california 28

california 29

california 30

After I had cioppino for lunch, the rain and wind started to come in. I then made my way up to San Francisco along the 1, since it was a weekday and I knew the traffic that was ahead of me as I drove up the peninsula. This version of the documentary will end here, but I will keep adding to it as I come across more stories and information about the coast.

Tokyo 1954

Notes to follow.

Ultra HD Digital 8k

It’s been a long time coming, but its application remains to be seen.

In 2006, researchers at NHK demonstrated in Las Vegas a transmission of Ultra High Definition Television with 22.2 surround sound. The broadcast ran from Tokyo to Osaka over an IP network at 1 Gbps. Uncompressed, the sound signal alone ran at 20 Mbps, while the video signal ran at 24 Gbps.

Current broadcast standards use MPEG-2 compression with a maximum resolution of 1920 x 1080. Ultra HD runs at 7680 x 4320 pixels.

Developed by NHK in Japan, Ultra HD has roughly 4000 scanning lines compared to just 1080 for the current broadcast system.

In 2007, SMPTE approved Ultra HDTV as a standard format.

The BBC will broadcast the London Summer Olympics in Ultra HD.

Each frame is equal to 33 Megapixels.
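As a back-of-the-envelope check (the frame rate and bit depth here are my assumptions, not figures from the NHK demo), 7680 x 4320 works out to about 33 megapixels per frame, and at 8-bit RGB and 30 frames per second the raw stream lands right around the 24 Gbps figure:

```java
public class UhdBitrate {
    public static void main(String[] args) {
        long width = 7680, height = 4320;
        long pixelsPerFrame = width * height;        // 33,177,600 pixels
        double megapixels = pixelsPerFrame / 1e6;    // ~33 MP per frame

        // Assumed parameters: 24 bits per pixel (8-bit RGB) at 30 frames/s.
        int bitsPerPixel = 24;
        int framesPerSecond = 30;
        double rawGbps = pixelsPerFrame * (double) bitsPerPixel * framesPerSecond / 1e9;

        System.out.printf("%,d pixels (%.1f MP) per frame%n", pixelsPerFrame, megapixels);
        System.out.printf("raw video: %.1f Gbps%n", rawGbps); // ~23.9 Gbps
    }
}
```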

I can see this as digital IMAX, but more. The 100-degree viewing angle allows for an image that can simulate human perception. It’s quite hard to describe since the image is so huge, but the experience is almost lifelike.

This type of imaging is a step toward building that holodeck. The amount of detail the resolution provides will give computers far more information to see, although the current limitation is having enough processing power to handle the data.

 

Santa Monica to Cambria

http://maps.google.com/maps?f=d&source=s_d&saddr=Santa+Monica,+CA&daddr=34.05997,-118.69429+to:35.565982,-121.07833&hl=en&geocode=FX4YBwIdyffv-CkZAJHCzqTCgDGr9SP_tQoXtA%3BFcK2BwIdbt7s-Cl3iv-i6h_ogDHsjjKuK3UCng%3BFZ6xHgIdxn3I-A&aq=0&oq=my+&sll=34.042419,-118.692169&sspn=0.264002,1.40625&mra=dpe&mrsp=1&sz=10&via=1&dmli=2&glp=1&ie=UTF8&ll=34.042419,-118.692169&spn=0.264002,1.40625&t=m&output=embed


So my journey begins on the part of California 1 at Santa Monica beach. Let’s get an overview of the purpose of the trip. Two years ago I fell in love with the roads here in California; they provided amazing landscapes and driving opportunities that made me want to make this. My dream is to eventually photograph and document various coastlines along the Pacific Ocean.

Santa Monica Pier

The Pacific Coast Highway is an All-American Road, as is Route 66. Highway 1 shares the roadway with US 101 and even Interstate 5 at some points. But the most beautiful part of California 1 is the stretch I’m driving tomorrow, between Big Sur and Monterey.

But for today, it’s from Santa Monica to Cambria.

With gas prices at an all-time high, hybrid cars were nowhere to be found at the Hertz rental. I was shocked that they offered a Hyundai as a replacement. I know it’s a great car with great gas mileage. But seriously?! So I went for the Mazda sedan instead. My friend had this car and it was awesome, and so was this one: very sporty, with a manual shift option should I want it. Yes please.

The cameras used for this are the Sony NEX-C3, Panasonic Lumix LX5, and a Nikon D200. Mounting them was a challenge. The focal length of the Sony was a bit short, and all the shaking in the car from the roads meant that most of my footage from day 1 is useless. Salvageable, but it will take a lot of work.

The Lumix video was great, but the battery ran out. I do have an audio recording of the whole thing, so that’s good. As for the D200, I’ll need to make a few adjustments today to get better images, but the system is working as I had hoped.

The trip took me roughly four hours, and it was long. Most of it was on US 101 to make better time, as the sun was setting and night driving would make shooting difficult.

Getting out of Southern California is easier said than done. For some reason the 405 was not moving. Good thing I decided to start from Santa Monica; this let me bypass that area completely, take the PCH up to Malibu, hang a right at Pepperdine University, and hit the 101 from there.

The weather was great, so everyone was out at the beach, which meant traffic.

The Malibu canyons, with their twists and turns, gave me a great opportunity to get a feel for the car, and I was happy. The rental reminds me of my own car back home.

I had been running on coffee the whole day with my “jetlag” from New York, and things were getting dicey around Santa Maria, a good hour and a half from Cambria. Gas mileage wasn’t so good: I stopped in San Luis Obispo and filled up 7.5 gallons.

It was well into the night once I got to the inn in Cambria where I would spend the night to recharge and prepare for the next day.

Cameras of the world 5 years from now

There are views that more cameras out there could mean two things. One is that we can finally get a grasp on what’s going on in the world: no longer can dictators and criminals hide from us, and we can finally form our own opinions on subjects that once took an army of journalists to capture and analyze. Then of course there’s the downside: the scary part is who is behind the cameras. We already partly live in that world, our every movement captured and stored on servers for who knows how long.

Cameras enabled their creators to preserve their time and place, and they continue to do so today. The 2011 Japan earthquake was so devastating that we were getting live images as the tsunami swept through the northern region. Users shared videos of the quake as it was happening, and for the first time the world could see a terrible disaster live.

I for one would like to be optimistic about where technology is leading us in terms of cameras. I long for images of the old cities back home. I wish I could re-create the city the way it was before World War II, or even better, re-create Old Manila during the Spanish era. We would be able to take a walk into history, so to speak, and understand and experience the place and time where my grandparents and great-grandparents lived. Like a living holodeck built from information from the past.

Cameras are something we fear today. But they are something our descendants will look to in the future.

Adaptation

apesketches

This week in Comics, we adapted T.C. Boyle’s short story “The Ape Lady in Retirement” into six panels of comic book form. It was a challenge figuring out what to keep and what to take away, and my process resulted in this.

While on the train, I sketched out the scenes that were important in moving the story forward. Even though the story is told through the eyes of the female protagonist, Beatrice, the actions all belong to Konrad the ape.

apesketches

This is the final sketch in panels. I chose to focus on Konrad’s face for his ability to express emotions similar to humans. Though my sketches are crude, I was attempting to show how Konrad reacted to the events around him. Given more time, I would have narrated it through Beatrice’s words but Konrad’s actions.

Apple bite

Watch here

 

import oscP5.*;
OscP5 oscP5;

//crane[] crane;
PImage crane1;
PImage crane2;
PImage crane3;
PImage crane4;
PImage crane5;
PImage apple;

String crane= "crane1, crane2, crane3, crane4, crane5";

PVector posePosition;
boolean found;
float eyeLeftHeight;
float eyeRightHeight;
float mouthHeight;
float mouthWidth;
float nostrilHeight;
float leftEyebrowHeight;
float rightEyebrowHeight;

float[] chew = new float [5];
//float[] crane = new float [crane1, crane2, crane3, crane4, crane5];

//float chew = 0;

PVector[] meshPoints;

float poseScale;

void setup() {
size(640, 480);
frameRate(30);

for (int i =0; i < chew.length; i++) {
chew[i] += 1;
}

// crane = new crane [crane1, crane2, crane3, crane4, crane5];
crane1 = loadImage("crane01.JPG");
crane2 = loadImage("crane02.JPG");
crane3 = loadImage("crane03.JPG");
crane4 = loadImage("crane04.JPG");
crane5 = loadImage("crane05.JPG");
apple = loadImage("apple.jpg");

meshPoints = new PVector[66];

for (int i = 0; i < meshPoints.length; i++) {
meshPoints[i] = new PVector();
}

oscP5 = new OscP5(this, 8338);
oscP5.plug(this, "mouthWidthReceived", "/gesture/mouth/width");
oscP5.plug(this, "mouthHeightReceived", "/gesture/mouth/height");
oscP5.plug(this, "eyebrowLeftReceived", "/gesture/eyebrow/left");
oscP5.plug(this, "eyebrowRightReceived", "/gesture/eyebrow/right");
oscP5.plug(this, "eyeLeftReceived", "/gesture/eye/left");
oscP5.plug(this, "eyeRightReceived", "/gesture/eye/right");
oscP5.plug(this, "jawReceived", "/gesture/jaw");
oscP5.plug(this, "nostrilsReceived", "/gesture/nostrils");
oscP5.plug(this, "found", "/found");
oscP5.plug(this, "poseOrientation", "/pose/orientation");
oscP5.plug(this, "posePosition", "/pose/position");
oscP5.plug(this, "poseScale", "/pose/scale");
oscP5.plug(this, "loadMesh", "/raw");
}

void draw() {
background(0);
stroke(100);

// Show the apple once the mouth opens wide enough
// (the comparison operators below were eaten by the blog's HTML; reconstructed).
if (mouthHeight > 1) {
image(apple, 0, 0, 640, 480);
}

/*
if (found) {
fill(255);
for (int i = 0; i < meshPoints.length; i++) {
PVector p = meshPoints[i];
if (i > 0) {
PVector prev = meshPoints[i-1];
line(prev.x, prev.y, p.x, p.y);
}
}
}
*/

/*
translate(posePosition.x, posePosition.y);
scale(poseScale);
noFill();
// ellipse(0,0, 3,3);
ellipse(-20, eyeLeftHeight * -9, 20, 7);
ellipse(20, eyeRightHeight * -9, 20, 7);
ellipse(0, 20, mouthWidth * 3, mouthHeight * 3);
ellipse(-5, nostrilHeight * -1, 7, 3);
ellipse(5, nostrilHeight * -1, 7, 3);
rectMode(CENTER);
fill(0);
rect(-20, leftEyebrowHeight * -5, 25, 5);
rect(20, rightEyebrowHeight * -5, 25, 5);
*/
}

public void mouthWidthReceived(float w) {
// println("mouth Width: " + w);
mouthWidth = w;
}

public void mouthHeightReceived(float h) {
println("mouth height: " + h);
mouthHeight = h;
}

public void eyebrowLeftReceived(float h) {
// println("eyebrow left: " + h);
leftEyebrowHeight = h;
}

public void eyebrowRightReceived(float h) {
// println("eyebrow right: " + h);
rightEyebrowHeight = h;
}

public void eyeLeftReceived(float h) {
// println("eye left: " + h);
eyeLeftHeight = h;
}

public void eyeRightReceived(float h) {
// println("eye right: " + h);
eyeRightHeight = h;
}

public void jawReceived(float h) {
// println("jaw: " + h);
}

public void nostrilsReceived(float h) {
// println("nostrils: " + h);
nostrilHeight = h;
}

public void found(int i) {
println("found: " + i); // 1 == found, 0 == not found
found = i == 1;
}

public void posePosition(float x, float y) {
//println("pose position\tX: " + x + " Y: " + y );
posePosition = new PVector(x, y);
}

public void poseScale(float s) {
//println("scale: " + s);
poseScale = s;
}

public void poseOrientation(float x, float y, float z) {
//println("pose orientation\tX: " + x + " Y: " + y + " Z: " + z);
}

public void loadMesh(float x0, float y0, float x1, float y1, float x2, float y2, float x3, float y3, float x4, float y4, float x5, float y5, float x6, float y6, float x7, float y7, float x8, float y8, float x9, float y9, float x10, float y10, float x11, float y11, float x12, float y12, float x13, float y13, float x14, float y14, float x15, float y15, float x16, float y16, float x17, float y17, float x18, float y18, float x19, float y19, float x20, float y20, float x21, float y21, float x22, float y22, float x23, float y23, float x24, float y24, float x25, float y25, float x26, float y26, float x27, float y27, float x28, float y28, float x29, float y29, float x30, float y30, float x31, float y31, float x32, float y32, float x33, float y33, float x34, float y34, float x35, float y35, float x36, float y36, float x37, float y37, float x38, float y38, float x39, float y39, float x40, float y40, float x41, float y41, float x42, float y42, float x43, float y43, float x44, float y44, float x45, float y45, float x46, float y46, float x47, float y47, float x48, float y48, float x49, float y49, float x50, float y50, float x51, float y51, float x52, float y52, float x53, float y53, float x54, float y54, float x55, float y55, float x56, float y56, float x57, float y57, float x58, float y58, float x59, float y59, float x60, float y60, float x61, float y61, float x62, float y62, float x63, float y63, float x64, float y64, float x65, float y65) {
println("loading mesh...");
meshPoints[0].x = x0;
meshPoints[0].y = y0;
meshPoints[1].x = x1;
meshPoints[1].y = y1;
meshPoints[2].x = x2;
meshPoints[2].y = y2;
meshPoints[3].x = x3;
meshPoints[3].y = y3;
meshPoints[4].x = x4;
meshPoints[4].y = y4;
meshPoints[5].x = x5;
meshPoints[5].y = y5;
meshPoints[6].x = x6;
meshPoints[6].y = y6;
meshPoints[7].x = x7;
meshPoints[7].y = y7;
meshPoints[8].x = x8;
meshPoints[8].y = y8;
meshPoints[9].x = x9;
meshPoints[9].y = y9;
meshPoints[10].x = x10;
meshPoints[10].y = y10;
meshPoints[11].x = x11;
meshPoints[11].y = y11;
meshPoints[12].x = x12;
meshPoints[12].y = y12;
meshPoints[13].x = x13;
meshPoints[13].y = y13;
meshPoints[14].x = x14;
meshPoints[14].y = y14;
meshPoints[15].x = x15;
meshPoints[15].y = y15;
meshPoints[16].x = x16;
meshPoints[16].y = y16;
meshPoints[17].x = x17;
meshPoints[17].y = y17;
meshPoints[18].x = x18;
meshPoints[18].y = y18;
meshPoints[19].x = x19;
meshPoints[19].y = y19;
meshPoints[20].x = x20;
meshPoints[20].y = y20;
meshPoints[21].x = x21;
meshPoints[21].y = y21;
meshPoints[22].x = x22;
meshPoints[22].y = y22;
meshPoints[23].x = x23;
meshPoints[23].y = y23;
meshPoints[24].x = x24;
meshPoints[24].y = y24;
meshPoints[25].x = x25;
meshPoints[25].y = y25;
meshPoints[26].x = x26;
meshPoints[26].y = y26;
meshPoints[27].x = x27;
meshPoints[27].y = y27;
meshPoints[28].x = x28;
meshPoints[28].y = y28;
meshPoints[29].x = x29;
meshPoints[29].y = y29;
meshPoints[30].x = x30;
meshPoints[30].y = y30;
meshPoints[31].x = x31;
meshPoints[31].y = y31;
meshPoints[32].x = x32;
meshPoints[32].y = y32;
meshPoints[33].x = x33;
meshPoints[33].y = y33;
meshPoints[34].x = x34;
meshPoints[34].y = y34;
meshPoints[35].x = x35;
meshPoints[35].y = y35;
meshPoints[36].x = x36;
meshPoints[36].y = y36;
meshPoints[37].x = x37;
meshPoints[37].y = y37;
meshPoints[38].x = x38;
meshPoints[38].y = y38;
meshPoints[39].x = x39;
meshPoints[39].y = y39;
meshPoints[40].x = x40;
meshPoints[40].y = y40;
meshPoints[41].x = x41;
meshPoints[41].y = y41;
meshPoints[42].x = x42;
meshPoints[42].y = y42;
meshPoints[43].x = x43;
meshPoints[43].y = y43;
meshPoints[44].x = x44;
meshPoints[44].y = y44;
meshPoints[45].x = x45;
meshPoints[45].y = y45;
meshPoints[46].x = x46;
meshPoints[46].y = y46;
meshPoints[47].x = x47;
meshPoints[47].y = y47;
meshPoints[48].x = x48;
meshPoints[48].y = y48;
meshPoints[49].x = x49;
meshPoints[49].y = y49;
meshPoints[50].x = x50;
meshPoints[50].y = y50;
meshPoints[51].x = x51;
meshPoints[51].y = y51;
meshPoints[52].x = x52;
meshPoints[52].y = y52;
meshPoints[53].x = x53;
meshPoints[53].y = y53;
meshPoints[54].x = x54;
meshPoints[54].y = y54;
meshPoints[55].x = x55;
meshPoints[55].y = y55;
meshPoints[56].x = x56;
meshPoints[56].y = y56;
meshPoints[57].x = x57;
meshPoints[57].y = y57;
meshPoints[58].x = x58;
meshPoints[58].y = y58;
meshPoints[59].x = x59;
meshPoints[59].y = y59;
meshPoints[60].x = x60;
meshPoints[60].y = y60;
meshPoints[61].x = x61;
meshPoints[61].y = y61;
meshPoints[62].x = x62;
meshPoints[62].y = y62;
meshPoints[63].x = x63;
meshPoints[63].y = y63;
meshPoints[64].x = x64;
meshPoints[64].y = y64;
meshPoints[65].x = x65;
meshPoints[65].y = y65;
}

void oscEvent(OscMessage theOscMessage) {
if (theOscMessage.isPlugged()==false) {
println("UNPLUGGED: " + theOscMessage);
}
}

Face tracking projects

  • In-car face tracking to detect driver fatigue and other behavior.
  • PS Eye head tracking to move the POV of a game according to the position of the head.
  • Example
  • Facial tracking in cars to control objects in the car and identify the driver.
  • Facial recognition to identify people in photos, as in Apple’s iPhoto and Aperture.
  • Targeting objects in real life and firing deadly missiles, as Apache helicopter pilots do.

Moments in panels

This week’s homework examines the different ways we can combine words and images.
comicspanel1interdependent
Interdependent

comicspanel1wordspecific

Word Specific

comicspanel1parallel

Parallel

comicspanel1picturespecific

Picture specific

Personally, I preferred the picture-specific method for this particular story of me crashing my bike. I think telling the story through images conveys more emotion. I leave it to the reader’s imagination how painful the crash was, or the feeling of your life flashing before your eyes. No words can properly explain that.

Skeletal tracking

This week Kim Ash and I worked together on skeletal tracking with the Kinect using OpenNI. The idea is that when you strike a pose, a “nuclear” explosion occurs. Using the code samples from ITP resident Greg Borenstein’s book Making Things See (2011), it was fairly straightforward to get the skeletal tracking in place.

We wanted the explosion to occur once the two outstretched arms were in place.

skeletal tracking

In this image, we just wanted to track the arms. This is possible using the OpenNI commands:

  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, rightHand);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, rightElbow);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, rightShoulder);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, leftShoulder);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, leftElbow);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_HAND, leftHand);

Then, with an “if” statement, it was just a matter of comparing joint positions to detect the outstretched-arms pose.

if (rightElbow.y > rightShoulder.y && rightElbow.x > rightShoulder.x && leftElbow.y > leftShoulder.y && leftElbow.x > leftShoulder.x) {
stroke(255);
}
else {
tint(255, 255);
image(cloud, 840, 130, 206, 283);
explosion.play();
// stroke(255, 0, 0);
}
kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);

  // right hand above right elbow
  // AND
  // right hand right of right elbow
  if (rightHand.y > rightElbow.y && rightHand.x > rightElbow.x && leftHand.y > leftElbow.y && leftHand.x > leftElbow.x) {
    stroke(255);
  }
  else {
     tint(255, 255);
     image(cloud, 840, 130, 206, 283);
     explosion.play();
 //   stroke(255, 0, 0);
  }
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HAND, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HAND, SimpleOpenNI.SKEL_LEFT_ELBOW);
}

Which results in this:

We wanted a better screen capture but for some reason this sketch didn’t like Ambrosia’s SnapzPro.

Full code:

import ddf.minim.*;
import ddf.minim.signals.*;
import ddf.minim.analysis.*;
import ddf.minim.effects.*;

Minim minim;
AudioPlayer explosion;

import SimpleOpenNI.*;
SimpleOpenNI kinect;
PImage back;
PImage cloud;

void setup() {
size(640*2, 480);
back = loadImage("desert.png");
cloud = loadImage("cloud.png");
// imageMode(CENTER);

minim = new Minim(this);
explosion = minim.loadFile("explosion.mp3");

kinect = new SimpleOpenNI(this);
kinect.enableDepth();
kinect.enableRGB();
kinect.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
strokeWeight(5);
}

void draw() {
background(0);
kinect.update();
image(kinect.depthImage(), 0, 0);
// image(kinect.rgbImage(),640,0);
image(back, 640, 0, 640, 480);

IntVector userList = new IntVector();
kinect.getUsers(userList);
if (userList.size() > 0) {
int userId = userList.get(0);
if ( kinect.isTrackingSkeleton(userId)) {
PVector rightHand = new PVector();
PVector rightElbow = new PVector();
PVector rightShoulder = new PVector();
PVector leftHand = new PVector();
PVector leftElbow = new PVector();
PVector leftShoulder = new PVector();

  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, rightHand);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, rightElbow);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, rightShoulder);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, leftShoulder);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, leftElbow);
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_HAND, leftHand);

  // right elbow above right shoulder
  // AND
  // right elbow right of right shoulder
  if (rightElbow.y > rightShoulder.y && rightElbow.x > rightShoulder.x && leftElbow.y > leftShoulder.y && leftElbow.x > leftShoulder.x) {
    stroke(255);
  }
  else {
     tint(255, 255);
     image(cloud, 840, 130, 206, 283);
     explosion.play();
   // stroke(255, 0, 0);
  }
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);

  // right hand above right elbow
  // AND
  // right hand right of right elbow
  if (rightHand.y > rightElbow.y && rightHand.x > rightElbow.x && leftHand.y > leftElbow.y && leftHand.x > leftElbow.x) {
    stroke(255);
  }
  else {
     tint(255, 255);
     image(cloud, 840, 130, 206, 283);
     explosion.play();
 //   stroke(255, 0, 0);
  }
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HAND, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HAND, SimpleOpenNI.SKEL_LEFT_ELBOW);
}

}
}

// user-tracking callbacks!
void onNewUser(int userId) {
println("start pose detection");
kinect.startPoseDetection("Psi", userId);
}

void onEndCalibration(int userId, boolean successful) {
if (successful) {
println("User calibrated!");
kinect.startTrackingSkeleton(userId);
}
else {
println("Failed to calibrate user!");
kinect.startPoseDetection("Psi", userId);
}
}

void onStartPose(String pose, int userId) {
println("Started pose for user");
kinect.stopPoseDetection(userId);
kinect.requestCalibrationSkeleton(userId, true);
}

void keyPressed() {

switch(key)
{
case ' ':
kinect.setMirror(!kinect.mirror());
break;
}
}

void stop() {
explosion.close();
minim.stop();
super.stop();
}