A far cry from what I made for the midterm, I think this would look better on a surface other than my computer screen.
So I’ve been able to parse Yahoo’s weather API and the AP headline feed for news. Juggling two XML parsers is a bit challenging. There’s something in the Flickr API that isn’t working for me, so for now each image is manually assigned to each weather condition code that Yahoo defines.
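The manual fallback described above can be sketched roughly like this: pull the condition code out of the weather XML, then look it up in a hand-built table instead of searching Flickr. This is a hypothetical sketch, not the project’s actual code; the sample feed, namespace usage, and image filenames are my own assumptions modeled on the shape of Yahoo’s old weather RSS.

```python
import xml.etree.ElementTree as ET

# Namespace Yahoo's weather RSS used for its custom elements (assumption,
# based on the feed's documented shape at the time).
YWEATHER_NS = "http://xml.weather.yahoo.com/ns/rss/1.0"

# Hypothetical snippet in the shape of the Yahoo weather feed.
SAMPLE_FEED = """<rss version="2.0" xmlns:yweather="{ns}">
  <channel>
    <item>
      <title>Conditions for New York, NY</title>
      <yweather:condition text="Partly Cloudy" code="30" temp="48"/>
    </item>
  </channel>
</rss>""".format(ns=YWEATHER_NS)

# Manual mapping from condition codes to images, standing in for the
# Flickr lookup that wasn't cooperating. Filenames are made up.
CONDITION_IMAGES = {
    "26": "cloudy.jpg",
    "30": "partly_cloudy.jpg",
    "32": "sunny.jpg",
}

def image_for_feed(xml_text):
    """Parse the weather XML and return the manually assigned image."""
    root = ET.fromstring(xml_text)
    cond = root.find(".//{%s}condition" % YWEATHER_NS)
    code = cond.get("code")
    return CONDITION_IMAGES.get(code, "default.jpg")

print(image_for_feed(SAMPLE_FEED))  # partly_cloudy.jpg
```

The AP headline feed would go through a second parser the same way, just pulling `<title>` elements out of `<item>`s instead of condition codes.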
This was the original weather midterm project, with no live data and just computer-drawn images.
This is the raw AP headline feed experiment that I merged with my earlier code.
I think this kind of information shouldn’t live only on our mobile phones and screens; it should be around us. I had a last-minute inspiration on the subway ride home: I wish this were projected on the window of the train so I’d know what’s going on above ground.
I wanted to project this on a surface with a Kinect camera watching that surface at the same time, giving the illusion of gesture control: making certain gestures on the projected surface would either refresh the news data or change the weather location.
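The gesture idea could be wired up as a small dispatcher: the Kinect side would classify a gesture name, and a lookup table maps it to an action on the display. Everything here is hypothetical; the gesture names, the Boston example, and the state fields are placeholders I made up to show the shape of it.

```python
def make_dispatcher():
    """Map named gestures to display actions (hypothetical sketch)."""
    state = {"location": "New York", "refreshes": 0}

    def refresh_news():
        state["refreshes"] += 1  # would re-fetch the AP feed here

    def change_location(new_location):
        state["location"] = new_location  # would re-query the weather feed

    # Gesture names are placeholders for whatever the Kinect recognizer emits.
    actions = {
        "swipe_down": lambda: refresh_news(),
        "swipe_right": lambda: change_location("Boston"),
    }

    def handle(gesture):
        action = actions.get(gesture)
        if action:
            action()
        return state

    return handle

handle = make_dispatcher()
handle("swipe_down")            # refresh the headlines
print(handle("swipe_right"))    # {'location': 'Boston', 'refreshes': 1}
```

Unknown gestures just fall through, so stray Kinect noise wouldn’t change anything on screen.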
I think I’ll work on this some more during the break. Code to be posted soon.