Sunday, January 6, 2008

What Could We Do With 20,000+ "Airborne - Immersive Hotdogs" Over The US Per Day?

I just flew in from Miami - and boy did I have an idea.....

On my way down to Miami to watch Virginia Tech hand Kansas University a victory in the 2008 Orange Bowl, I set my GPS logger up near my window seat and periodically snapped some shots out of the window.
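(For anyone curious about the mechanics, georeferencing the photos afterward is mostly a matter of matching each photo's EXIF timestamp to the nearest fix in the GPS track log. Here's a rough sketch of that step - the file names and CSV format are just placeholders standing in for whatever your logger actually exports.)

```python
# Rough sketch: match each photo's EXIF timestamp to the nearest fix in a GPS
# track log, then use that fix's latitude/longitude as the photo's position.
# File names and column layout are placeholders for your logger's export format.
import bisect
import csv
from datetime import datetime

# Load the GPS track as (timestamp, lat, lon) tuples, sorted by time.
# Assumes a simple CSV export like: 2008-01-03T14:22:05Z,25.7932,-80.2906
track = []
with open("gps_track.csv") as f:
    for time_str, lat, lon in csv.reader(f):
        t = datetime.strptime(time_str, "%Y-%m-%dT%H:%M:%SZ")
        track.append((t, float(lat), float(lon)))
track.sort()
times = [t for t, _, _ in track]

def position_at(photo_time):
    """Return the (lat, lon) of the track point closest in time to the photo."""
    i = bisect.bisect_left(times, photo_time)
    candidates = track[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda p: abs(p[0] - photo_time))[1:]

# Example: a photo snapped on the approach into MIA (timestamp from its EXIF data).
print(position_at(datetime(2008, 1, 3, 14, 22, 7)))
```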



(Click here to view or download the 88 Photo dataset for yourself via Google PicasaWeb.)


(Click here to view the interactive Google Maps version.)


(Or click here to view the dataset directly in Google Earth.)



I was pretty pleased with the results, particularly the shots coming in from the Atlantic Ocean, over Miami Beach and into MIA!

Hmm... sort of like the Bird's Eye imagery in Microsoft's Virtual Earth....

And, it got me thinking - could you imagine if instead of 88 shots over 2 hours from one window, I had the ability to record 30 frames per second from 12 lenses 360° around the plane for the full 2-hour flight?

That would give me 2,592,000 georeferenced images' worth of data.
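(A quick back-of-the-envelope check on that number, using nothing but the figures above:)

```python
# Back-of-the-envelope: frames captured over one flight.
FPS = 30          # frames per second, per lens
LENSES = 12       # lenses spread 360° around the plane
FLIGHT_HOURS = 2  # a roughly two-hour flight

frames_per_flight = FPS * LENSES * FLIGHT_HOURS * 3600
print(f"{frames_per_flight:,} georeferenced frames per flight")
# -> 2,592,000 georeferenced frames per flight
```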

Then I started thinking about an NPR piece I heard back in November that mentioned the sheer number of flights flown across the US each day.



(Watch a beautiful video by Aaron Koblin depicting US air traffic below.)

[youtube=http://www.youtube.com/watch?v=dPv8psZsvIU]

So those 2.5 million+ georeferenced images would come from just one of the roughly 20,000 flights flown over the US each day... imagine if each of those flights carried a similar system!

I think this would be fairly easy to accomplish if you took something like the Immersive Media Dodeca System and distributed its sensors around the airplane (maybe as simply as splitting the camera in half along its equator and mounting one half on top of the plane and one half on the bottom, or spreading the individual lenses out all around the fuselage).

As far as data processing and storage are concerned, I'm not even sure that capability exists right now - but imagine all of this data being available to update Virtual Earth or Google Earth with near-real-time immersive imagery from multiple angles and altitudes, all time-stamped so that you could observe changes in construction, population movements, traffic, etc. over time.
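To put a very rough number on the storage side - and the per-frame size here is purely a guess on my part, while the rest comes straight from the figures above:

```python
# Very rough storage estimate for a day of nationwide capture.
# The bytes-per-frame figure is a pure guess (compressed frame size will vary
# wildly with resolution and codec); the other numbers come from the post above.
FRAMES_PER_FLIGHT = 2_592_000   # 30 fps x 12 lenses x 2 hours
FLIGHTS_PER_DAY = 20_000        # rough count of daily US flights
BYTES_PER_FRAME = 2 * 1024**2   # assume ~2 MB per compressed frame

frames_per_day = FRAMES_PER_FLIGHT * FLIGHTS_PER_DAY
petabytes_per_day = frames_per_day * BYTES_PER_FRAME / 1024**5
print(f"{frames_per_day:,} frames/day, roughly {petabytes_per_day:,.0f} PB/day")
# -> 51,840,000,000 frames/day, roughly 97 PB/day
```

However you slice it, that's an enormous pile of data to move and store every single day.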

Would the data be collected when the plane lands, or transmitted via some sort of tether to a central processing station?

Apparently the latter idea isn't so far-fetched: Boeing has reportedly demonstrated a satellite video tether system to the FAA, intended to broadcast in-cabin video for security purposes.

The cost of doing something like this would at first seem incredibly high - the electronics and sensors required aren't cheap today, but purchased at this scale they might not be too outrageous. More importantly, the platforms (commercial airliners) are already flying en masse every day - there is practically no cost associated with the actual flight of the sensors, only the initial equipment purchase and the infrastructure / storage / processing system.

Would it be worth it to the US to try to organize such an effort?

How much is currently spent on similar data collection for the USGS and other national agencies? Could this capability provide more accurate and denser data than traditional satellite-based sensor systems?

Then again - I'm having a hard enough time figuring out how to process less than 1 mile of an "Immersive Hotdog" along Manasquan, NJ's boardwalk - this "airborne - immersive hotdog" idea probably needs to go on hold for a while :)