Archive for February, 2012
In two weeks I’ll be embarking on a journey that will take me across the country. But after taking a breather and examining my capabilities, I’ve shortened the journey and changed it.
The cross-country train ride would have been amazing, but since the rail route isn’t a road, it isn’t part of Google Street View. That alone made it difficult, not to mention that when I mapped the ride to Chicago, it was very long.
I’ve decided to drive up the California coast instead, specifically the stretch of California Highway 1 between San Luis Obispo and San Francisco.
I drove there the first time in the spring of 2010 in a Nissan hybrid with my sister. This time around I’ll be travelling knowing what I’ll need. The difference now is that I’ll be equipping the car with a number of cameras, making it look almost like a simpler version of a Google Street View car. I’ve marked the series of stops and places I intend to visit, and the entire journey will take me three days.
There’s something to be said about a road trip. This documentary narrates some of the history of the road and will attempt to capture the unique views visible only from it. One of the driving factors for me on this documentary is the possibility that this road may no longer exist in its current state in the next ten years, through no fault of man: the Pacific Ocean is slowly eating away at the cliffs and eroding the land beneath them.
The drive is also very special. The trip will take me to a twisting stretch of the highway where I never bothered to look at the speed limit; at that point the speed limit is a distraction, and it isn’t posted anyway. I think it would be even better if I could get my hands on a manual-transmission car for this, but I doubt I’ll be able to find one.
I have part of the script for the narrative bits; the rest will be written in the car. This is very exciting.
More cameras out there could mean two things. One is that we can finally get a grasp on what’s going on in the world. No longer can dictators and criminals hide from us. For once we can form our own opinions on subjects that before took an army of journalists to capture and analyze. Then of course there’s the downside: the scary part is who is behind the cameras. We already live partly in that world, our every movement captured and stored on servers for who knows how long.
The camera enabled its creators to preserve their time and place, and it continues to do so today. The 2011 Japan earthquake was so devastating that we were getting live images as the tsunami swept through the northern region. Users shared their videos of the quake as it was happening, and for the first time, the world could watch a terrible disaster live.
I for one would like to be optimistic about where camera technology is leading us. I long for images of the old cities back home. I wish I could recreate the city the way it was before World War II, or even better, recreate Old Manila during the Spanish era. We would be able to take a walk into history, so to speak, and understand and experience the place and time where my grandparents and great-grandparents lived. Like a living holodeck built from information from the past.
Cameras are something we fear today, but they are something our descendants will look for in the future.
This week in Comics, we adapted T.C. Boyle’s short story “The Ape Lady in Retirement” into six panels of comic-book form. The challenge was figuring out what to keep and what to take away, and my process resulted in this.
While on the train, I sketched out the scenes that were important in moving the story forward. Even though the story is told through the eyes of the female protagonist, Beatrice, the actions all belong to Konrad the ape.
This is the final sketch in panels. I chose to focus on Konrad’s face for his ability to express emotions similar to ours. Though my sketches are crude, I was attempting to show how Konrad reacts to the events around him. Given more time, I would have narrated it through Beatrice’s words but Konrad’s actions.
import oscP5.*;

OscP5 oscP5;

// Crane animation frames and the apple image.
PImage crane1;
PImage crane2;
PImage crane3;
PImage crane4;
PImage crane5;
PImage apple;

// Pose and gesture values received from FaceOSC.
PVector posePosition;
boolean found;
float eyeLeftHeight;
float eyeRightHeight;
float mouthHeight;
float mouthWidth;
float nostrilHeight;
float leftEyebrowHeight;
float rightEyebrowHeight;
float[] chew = new float[5];
PVector[] meshPoints;
float poseScale;

void setup() {
  size(640, 480);
  frameRate(30);

  // Start each chew counter at 1.
  for (int i = 0; i < chew.length; i++) {
    chew[i] += 1;
  }

  crane1 = loadImage("crane01.JPG");
  crane2 = loadImage("crane02.JPG");
  crane3 = loadImage("crane03.JPG");
  crane4 = loadImage("crane04.JPG");
  crane5 = loadImage("crane05.JPG");
  apple = loadImage("apple.jpg");

  meshPoints = new PVector[66];
  for (int i = 0; i < meshPoints.length; i++) {
    meshPoints[i] = new PVector();
  }

  // Listen for FaceOSC messages on port 8338 and route each
  // address pattern to its handler method below.
  oscP5 = new OscP5(this, 8338);
  oscP5.plug(this, "mouthWidthReceived", "/gesture/mouth/width");
  oscP5.plug(this, "mouthHeightReceived", "/gesture/mouth/height");
  oscP5.plug(this, "eyebrowLeftReceived", "/gesture/eyebrow/left");
  oscP5.plug(this, "eyebrowRightReceived", "/gesture/eyebrow/right");
  oscP5.plug(this, "eyeLeftReceived", "/gesture/eye/left");
  oscP5.plug(this, "eyeRightReceived", "/gesture/eye/right");
  oscP5.plug(this, "jawReceived", "/gesture/jaw");
  oscP5.plug(this, "nostrilsReceived", "/gesture/nostrils");
  oscP5.plug(this, "found", "/found");
  oscP5.plug(this, "poseOrientation", "/pose/orientation");
  oscP5.plug(this, "posePosition", "/pose/position");
  oscP5.plug(this, "poseScale", "/pose/scale");
  oscP5.plug(this, "loadMesh", "/raw");
}
void draw() {
  background(0);
  stroke(100);

  // Show the apple while the mouth is open.
  if (mouthHeight > 1) {
    image(apple, 0, 0, 640, 480);
  }

  /*
  // Debug view from the FaceOSC template: draw the raw mesh
  // points and a simple face from the gesture values.
  if (found) {
    fill(255);
    for (int i = 0; i < meshPoints.length; i++) {
      PVector p = meshPoints[i];
      ellipse(p.x, p.y, 3, 3);
      if (i > 0) {
        PVector prev = meshPoints[i - 1];
        line(prev.x, prev.y, p.x, p.y);
      }
    }
    translate(posePosition.x, posePosition.y);
    scale(poseScale);
    noFill();
    ellipse(-20, eyeLeftHeight * -9, 20, 7);
    ellipse(20, eyeRightHeight * -9, 20, 7);
    ellipse(0, 20, mouthWidth * 3, mouthHeight * 3);
    ellipse(-5, nostrilHeight * -1, 7, 3);
    ellipse(5, nostrilHeight * -1, 7, 3);
    rectMode(CENTER);
    fill(0);
    rect(-20, leftEyebrowHeight * -5, 25, 5);
    rect(20, rightEyebrowHeight * -5, 25, 5);
  }
  */
}
public void mouthWidthReceived(float w) {
  // println("mouth width: " + w);
  mouthWidth = w;
}

public void mouthHeightReceived(float h) {
  println("mouth height: " + h);
  mouthHeight = h;
}

public void eyebrowLeftReceived(float h) {
  // println("eyebrow left: " + h);
  leftEyebrowHeight = h;
}

public void eyebrowRightReceived(float h) {
  // println("eyebrow right: " + h);
  rightEyebrowHeight = h;
}

public void eyeLeftReceived(float h) {
  // println("eye left: " + h);
  eyeLeftHeight = h;
}

public void eyeRightReceived(float h) {
  // println("eye right: " + h);
  eyeRightHeight = h;
}

public void jawReceived(float h) {
  // println("jaw: " + h);
}

public void nostrilsReceived(float h) {
  // println("nostrils: " + h);
  nostrilHeight = h;
}

public void found(int i) {
  println("found: " + i); // 1 == found, 0 == not found
  found = i == 1;
}

public void posePosition(float x, float y) {
  // println("pose position\tX: " + x + " Y: " + y);
  posePosition = new PVector(x, y);
}

public void poseScale(float s) {
  // println("scale: " + s);
  poseScale = s;
}

public void poseOrientation(float x, float y, float z) {
  // println("pose orientation\tX: " + x + " Y: " + y + " Z: " + z);
}
// The signature must list all 132 floats so oscP5's reflection-based
// plug() can match the /raw message.
public void loadMesh(float x0, float y0, float x1, float y1, float x2, float y2, float x3, float y3, float x4, float y4, float x5, float y5, float x6, float y6, float x7, float y7, float x8, float y8, float x9, float y9, float x10, float y10, float x11, float y11, float x12, float y12, float x13, float y13, float x14, float y14, float x15, float y15, float x16, float y16, float x17, float y17, float x18, float y18, float x19, float y19, float x20, float y20, float x21, float y21, float x22, float y22, float x23, float y23, float x24, float y24, float x25, float y25, float x26, float y26, float x27, float y27, float x28, float y28, float x29, float y29, float x30, float y30, float x31, float y31, float x32, float y32, float x33, float y33, float x34, float y34, float x35, float y35, float x36, float y36, float x37, float y37, float x38, float y38, float x39, float y39, float x40, float y40, float x41, float y41, float x42, float y42, float x43, float y43, float x44, float y44, float x45, float y45, float x46, float y46, float x47, float y47, float x48, float y48, float x49, float y49, float x50, float y50, float x51, float y51, float x52, float y52, float x53, float y53, float x54, float y54, float x55, float y55, float x56, float y56, float x57, float y57, float x58, float y58, float x59, float y59, float x60, float y60, float x61, float y61, float x62, float y62, float x63, float y63, float x64, float y64, float x65, float y65) {
  println("loading mesh...");
  meshPoints[0].set(x0, y0);
  meshPoints[1].set(x1, y1);
  meshPoints[2].set(x2, y2);
  meshPoints[3].set(x3, y3);
  meshPoints[4].set(x4, y4);
  meshPoints[5].set(x5, y5);
  meshPoints[6].set(x6, y6);
  meshPoints[7].set(x7, y7);
  meshPoints[8].set(x8, y8);
  meshPoints[9].set(x9, y9);
  meshPoints[10].set(x10, y10);
  meshPoints[11].set(x11, y11);
  meshPoints[12].set(x12, y12);
  meshPoints[13].set(x13, y13);
  meshPoints[14].set(x14, y14);
  meshPoints[15].set(x15, y15);
  meshPoints[16].set(x16, y16);
  meshPoints[17].set(x17, y17);
  meshPoints[18].set(x18, y18);
  meshPoints[19].set(x19, y19);
  meshPoints[20].set(x20, y20);
  meshPoints[21].set(x21, y21);
  meshPoints[22].set(x22, y22);
  meshPoints[23].set(x23, y23);
  meshPoints[24].set(x24, y24);
  meshPoints[25].set(x25, y25);
  meshPoints[26].set(x26, y26);
  meshPoints[27].set(x27, y27);
  meshPoints[28].set(x28, y28);
  meshPoints[29].set(x29, y29);
  meshPoints[30].set(x30, y30);
  meshPoints[31].set(x31, y31);
  meshPoints[32].set(x32, y32);
  meshPoints[33].set(x33, y33);
  meshPoints[34].set(x34, y34);
  meshPoints[35].set(x35, y35);
  meshPoints[36].set(x36, y36);
  meshPoints[37].set(x37, y37);
  meshPoints[38].set(x38, y38);
  meshPoints[39].set(x39, y39);
  meshPoints[40].set(x40, y40);
  meshPoints[41].set(x41, y41);
  meshPoints[42].set(x42, y42);
  meshPoints[43].set(x43, y43);
  meshPoints[44].set(x44, y44);
  meshPoints[45].set(x45, y45);
  meshPoints[46].set(x46, y46);
  meshPoints[47].set(x47, y47);
  meshPoints[48].set(x48, y48);
  meshPoints[49].set(x49, y49);
  meshPoints[50].set(x50, y50);
  meshPoints[51].set(x51, y51);
  meshPoints[52].set(x52, y52);
  meshPoints[53].set(x53, y53);
  meshPoints[54].set(x54, y54);
  meshPoints[55].set(x55, y55);
  meshPoints[56].set(x56, y56);
  meshPoints[57].set(x57, y57);
  meshPoints[58].set(x58, y58);
  meshPoints[59].set(x59, y59);
  meshPoints[60].set(x60, y60);
  meshPoints[61].set(x61, y61);
  meshPoints[62].set(x62, y62);
  meshPoints[63].set(x63, y63);
  meshPoints[64].set(x64, y64);
  meshPoints[65].set(x65, y65);
}
void oscEvent(OscMessage theOscMessage) {
  if (!theOscMessage.isPlugged()) {
    println("UNPLUGGED: " + theOscMessage);
  }
}
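The crane frames and the chew[] counters above are loaded and initialized but never actually drawn. One way to wire them up, sketched here as plain Java so the logic stands alone (the open-mouth threshold and the ticks-per-frame value are my assumptions, not values from FaceOSC), is to advance a counter whenever the mouth is open and cycle through the five frames:

```java
public class ChewAnimation {
    static final int FRAME_COUNT = 5;         // crane01.JPG .. crane05.JPG
    static final float OPEN_THRESHOLD = 1.0f; // assumed cutoff for an "open" mouth

    int chewCount = 0;

    // Advance the counter while the mouth is open; call once per draw() frame.
    void update(float mouthHeight) {
        if (mouthHeight > OPEN_THRESHOLD) {
            chewCount++;
        }
    }

    // Hold each frame for ticksPerFrame frames, then move to the next,
    // wrapping around so the chewing loops.
    int currentFrame(int ticksPerFrame) {
        return (chewCount / ticksPerFrame) % FRAME_COUNT;
    }

    public static void main(String[] args) {
        ChewAnimation anim = new ChewAnimation();
        for (int i = 0; i < 12; i++) {
            anim.update(2.0f); // mouth held open for 12 frames
        }
        System.out.println(anim.currentFrame(3)); // prints 4
    }
}
```

In the sketch, draw() would then display the matching crane image, e.g. crane3 when the current frame index is 2.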
I’ve made documentaries before, so this is not entirely new. We basically tell a story, and the usual way of telling a story is simply to move forward. “Connecting” seems to be just as complicated as it sounds.
For one, you no longer have a captive audience digesting your work in one sitting. The audience can jump from one part of your story to another; they can view your documentary on their own time and watch only the parts they are interested in. So is there a way for the filmmaker’s vision to translate properly into the hyperlinked form?
I think it’s possible, but the material requires two different approaches resulting in the same ending. One path, determined by the author, is the “true vision” for the work: exactly as the creator intended it, to be experienced as such. The other path is the one created by the users as they move through the story on their own time, following their own interests.
With this in mind, I think traditional documentary methods still hold true for two of my proposals. One is the train ride; the other is a Woody Allen map of Manhattan, which I think I can narrow down to his film Manhattan (1979).
For Manhattan, there are various sources, such as the Internet Movie Database and Google Maps, for determining the filming locations. For instance, the iconic scene with Woody Allen and Diane Keaton was shot exactly on this spot. As you can see, today there is no park bench. I wonder what this place looks like at sunrise, as depicted in the film.
Then embed this scene,
That should work. It would allow the film to be enjoyed in a different way. Something like that.
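One lightweight way to tie each scene to its spot, without any special API, is to generate a plain Google Maps link per filming location. A minimal sketch in Java; the coordinates here are hypothetical placeholders, not verified locations:

```java
public class SceneMap {
    // Build a shareable Google Maps link for one filming location.
    static String mapsLink(double lat, double lon) {
        return "https://www.google.com/maps?q=" + lat + "," + lon;
    }

    public static void main(String[] args) {
        // Hypothetical coordinates for a single scene; the real list
        // would come from IMDb filming-location pages, cross-checked
        // against the map and the film itself.
        System.out.println(mapsLink(40.757, -73.958));
    }
}
```

Each scene page could then pair the embedded clip with its link, so the viewer can jump between the film and the present-day spot.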
As for the train, I’ve downloaded maps and brochures from Amtrak for the routes that would get me to the West Coast. Sadly, I couldn’t find an API for Amtrak, but Google Maps does the job anyway.
Based on the railway system of the US, there are two ways west, either through the north or the south; the tracks don’t cross the country the way the highways do. It would also take two trains: the Lake Shore Limited and the California Zephyr. I think this is where the two paths I mentioned above can be created. One is my own journey on the train, including videos, pictures, and posts from along the way: the filmmaker’s point of view. The other lets the user follow the train along the track in real time via Google Earth, seeing the stops marked along the way from point A to point B and any historical significance they hold. This would also work along the East Coast, but I think riding across the country would be very exciting. Or maybe California Highway 1.
The technologies I can see integrating would be:
That’s all I can think of for now.
