Tuesday, January 19, 2010

Utilize NGA Geoprocessing Tasks For Haiti in Your Applications

I was very impressed last week with the speed with which the National Geospatial-Intelligence Agency's eGEOINT Management Office made some useful web-based GIS tools available for first responders to the Haitian Earthquake.

The "DemoBase Haiti" site unfortunately still only works in Internet Explorer, which, as I've communicated to the agency, is DEFINITELY NOT the browser of choice for pretty much anyone I know who is responding to the earthquake with their geospatial skills.

Go to a CrisisCamp or OSM Mapping Party and count the default IE users on one hand....they probably just got a laptop with Windows and haven't had a chance to install Firefox or Chrome or to wipe the hard drive and install Ubuntu :)

However, IE remains the security-threat cesspool of choice on US Government computers, so I guess most development STILL gears itself toward IE as a baseline.

Well, I thought it would be useful to expose the really powerful pieces of the DemoBase tool, the Geoprocessing Tasks, to anyone who wants to call on the NGA / NRL servers in their web-based applications.

The DemoBase tool currently has one very nice GP task, a Zonal Statistics tool that allows you to draw a polygon anywhere in Haiti and get back a population estimate...as accurate as the ESRI Zonal Statistics tool and the NGA data are, anyway...like I said, it is an estimate.

But, it could be incredibly helpful for those on the ground to be able to view recent imagery in an application and then just digitize a polygon on a city block of rubble and be able to estimate the population of that block.

I'm not a full time ESRI JavaScript API developer, but I did get some help from David Spriggs at ESRI to boil down a process for sending a simple polygon to the ArcGIS Server and receive a population estimate in return - I hope it is helpful for anyone trying to add some analytic capability to their Haitian support efforts.

I also hope that more widespread use of the geoprocessing tasks will show the agency how powerful exposing these services can be, and that they'll continue to add more geoprocessing tasks to their current offering - lord knows they have the data to make some very useful and interesting applications....

In my example, I simply create a polygon (a rectangle) out of an array of coordinate pairs - but you should be able to adapt the functions to any GeoJSON or other polygons you might have in your application.

My code will simply initialize and send the polygon through to the Geoprocessing Task and then display the result.

Here is the code:
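A minimal sketch of the polygon-building step, assuming WGS84 lon/lat coordinates; the input parameter name (`Input_Polygon`) is a placeholder - the real name, along with the task URL, comes from the service's REST page:

```javascript
// Build the ESRI-JSON FeatureSet that an ArcGIS Server geoprocessing
// task expects for a polygon input parameter.

function ringFromCoords(coords) {
  // ESRI polygon rings must be closed: repeat the first point at the end.
  var ring = coords.map(function (pt) { return [pt[0], pt[1]]; });
  var first = ring[0], last = ring[ring.length - 1];
  if (first[0] !== last[0] || first[1] !== last[1]) {
    ring.push([first[0], first[1]]);
  }
  return ring;
}

function buildGpInput(coords) {
  return {
    geometryType: "esriGeometryPolygon",
    spatialReference: { wkid: 4326 },   // WGS84 lon/lat
    features: [{
      geometry: {
        rings: [ringFromCoords(coords)],
        spatialReference: { wkid: 4326 }
      },
      attributes: {}
    }]
  };
}

// A rough rectangle over downtown Port-au-Prince (illustrative coordinates).
var rect = [
  [-72.36, 18.53],
  [-72.33, 18.53],
  [-72.33, 18.55],
  [-72.36, 18.55]
];

// "Input_Polygon" is an assumed parameter name - check the task's REST page.
var gpParams = { Input_Polygon: buildGpInput(rect) };
```

In the browser you would then hand `gpParams` to an `esri.tasks.Geoprocessor` pointed at the Zonal Statistics task and read the population value out of the result in its callback - the same pattern should work for a polygon digitized interactively or converted from GeoJSON.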

Friday, January 15, 2010

Haitian Earthquake Emphasizes Danger of a Split Geo Community

My career has afforded me the opportunity to be part of what I believe to be a wonderful and generous community: the world geospatial community. A typically happy group of geo-nerds armed with laptops, GPS-enabled gadgets, and a strong foundation of thinking spatially.

Now, this community certainly has some big business motives behind it, but whenever there is a disaster or crisis like we are seeing now in Haiti, this community comes together and throws everything it has to offer to help. It energizes me to do what I can to help in these times of need and dedicate myself to plying my trade for the cause. And as a Google employee, I'm blessed to have the full support of my management to work full time when necessary on these events.

While the first few hours of this particular disaster were frustrating as I watched the machine slowly gear up, I am blown away at the response we've pulled together - we're actually learning from these events and each one seems to get a little easier to manage - even if the scale of the disasters always seems to increase.

I sat on an early conference call with representatives of all the major GIS vendors, first responders, geo-nerds, NGOs, govies, and the media where everyone brought what they could do to the table and people teamed up to go do what they do best together.

For example, I watched Google, DigitalGlobe, and GeoEye all work together to get stunning imagery collected, processed, and published FREE to the international community to help a wide array of aid workers and first responders within 24 hours.

I watched the National Geospatial-Intelligence Agency, the US State Department, and other government agencies get critical and informative data and applications out the door and into the hands of people that needed them within mere hours of the disaster - a vast improvement over the Katrina days!

Perhaps most impressive has been the response that Mikel blogged about to the utter lack of vector data that the Geo Community had access to just after the earthquake.

Stealing his images, just look at the difference in OpenStreetMap's Port-au-Prince coverage after just a few days of the Geo Community swarming - finding old CIA library maps, public domain maps, etc., and tracing the new imagery released by the commercial satellite providers.

OSM just after the Earthquake

OSM Today

Now, that transformation is wonderful - it is astounding - but it isn't complete because there is a Split in the Geo Community that isn't being well addressed.

OpenStreetMap is not the only community data collection platform - Google also has MapMaker, which has similar tools and goals and has done a great job of expanding Google Maps to areas of the world where data was not traditionally available. If you live in an area that previously had blank Google Maps coverage, you're given the tools to fix that problem, and many users around the world have been happy to work on maps of their own areas so they can enjoy Google Maps, Directions, etc.

I thought it was great that, in addition to the countries from which Google already allows non-profits to download MapMaker data, Google added the data from Haiti and is now allowing any non-profit to download and use Google's data to help during this crisis.

But OSM and MapMaker aren't talking and I think it is a big problem - if you want to help rescue efforts in Haiti where do you go to digitize? OSM? MapMaker?

How can 2 projects be expected to stay in sync? Which is more "correct"? Which is more current?

This split means that these questions have to be asked by first responders, and by those working to create products for them.

"Is that road up there passable?" "Does it really exist?"

It means that the Geo Community is responsible for an extra decision between a first responder and a VICTIM.

As it stands right now, even though the MapMaker data is free for non-profit use, projects like OSM can't incorporate it because OSM data can be used commercially and the data belongs to Google, not OSM.

These are the old fights of GIS data; these are Navteq and TeleAtlas bugaboos IMHO, not what I expect to see today!

The differences are pretty glaring between OSM and MapMaker in some cases - take a look at the data I downloaded from both over Port-au-Prince.

The data is similar, but different, and needs to be conflated. Where that conflation happens, how it happens, I don't know - but I do know that we need to do something to fix this split before it gets people hurt.

That said, it is good to have 2 or more different projects; it forces competition in the tools, each project has different goals and metrics of success, and it probably ultimately means more community contribution as different groups migrate to different platforms - all adding to the cumulative Geo Community base data.

However, the data ultimately has to be conflated somewhere - and I urge OSM and MapMaker to work more closely with each other and build some sort of cross platform utility that lets users share edits and co-create data.