As Katherine Maher pointed out on Twitter, no one gets credit when contingency plans work. And it is truly amazing how much government-citizen information collaboration has evolved, not to mention the growth of data journalism even in the most traditional news outlets. The average citizen had a wealth of accurate and useful hurricane information to choose from. There was also plenty of raw data for those seeking to tinker, and well-made Twitter/Flickr/YouTube overlays on Google Earth interfaces. Anyone unlucky enough to turn on their television, by contrast, was assaulted by an overwhelming wave of hysteria and action shots of news anchors bravely intoning into the camera that it was windy and raining, an astonishing revelation, undoubtedly, to the millions told to evacuate.
Alex Howard of O'Reilly Radar has a very helpful list of Hurricane Irene tracking links drawn from government, media, and crowdsourced sources. Among the highlights:
- New York City government DataMine with lots of Irene-related geographic data.
- Google Earth kmz files for New York City hurricane evacuation zones.
- Maryland state government Google Earth-based iReport for hurricane damage reporting.
- New York Times hurricane tracking map feeding off National Weather Service data.
- Hurricane Irene cleanup map using Ushahidi interface.
- Geographic information system (GIS) and tweet mashup Florida GATOR with lots of downloadable files.
- Google's hurricane GIS/weather data aggregator.
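Several of the resources above, such as the New York City evacuation zones, distribute their geographic data as Google Earth KMZ files. A KMZ is just a ZIP archive wrapping a KML (XML) document, so the zone names and boundary coordinates can be pulled out with nothing but the standard library. The sketch below is illustrative only; the file name `zones.kmz` and the placemark layout are assumptions, not the actual structure of the city's files.

```python
import io
import xml.etree.ElementTree as ET
import zipfile

# KML 2.2 namespace used by Google Earth documents.
KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def list_evacuation_zones(kmz_path):
    """Return (name, coordinate string) pairs for each Placemark in a KMZ.

    A KMZ is a ZIP archive whose main document is a .kml file
    (conventionally named doc.kml).
    """
    with zipfile.ZipFile(kmz_path) as kmz:
        # Fall back to the first .kml entry if doc.kml is absent.
        kml_name = next(
            (n for n in kmz.namelist() if n.lower().endswith(".kml")), None
        )
        if kml_name is None:
            raise ValueError("no .kml document inside archive")
        root = ET.parse(io.BytesIO(kmz.read(kml_name))).getroot()

    zones = []
    for placemark in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = placemark.findtext(
            "kml:name", default="(unnamed)", namespaces=KML_NS
        )
        coords = placemark.findtext(
            ".//kml:coordinates", default="", namespaces=KML_NS
        )
        zones.append((name.strip(), coords.strip()))
    return zones
```

From there the coordinate strings (longitude,latitude,altitude triples) can be fed into whatever mapping layer you like, which is exactly the kind of tinkering the open data releases made possible.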
My personal favorite was ESRI's full-screen mega-aggregator, built on the ArcGIS API for Flex and running on an ArcGIS server. It combined data from the National Hurricane Center, OpenStreetMap, and Telvent, and let you view layered Flickr pictures, tweets, and YouTube videos on top of the usual hurricane data. Incidentally, ESRI has entered into a strategic alliance with MetaCarta, which Alex Olesker wrote about in May after attending the Department of Defense Intelligence Information Systems (DoDIIS) conference. MetaCarta's large database of locations, Olesker notes, is seven times the size of the National Geospatial-Intelligence Agency (NGA)'s gazetteer, making it a natural tool for geospatial analysis. It goes without saying that government Twitter accounts have also been a source of timely and accurate information. A wealth of government agencies, large and small, used Twitter and Facebook to spread the word before, during, and after the high point of the hurricane.
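Under the hood, a mashup like ESRI's boils down to stacking independent point layers, a storm track from one feed, geotagged photos and tweets from others, into a single map document. GeoJSON makes that layering almost trivial. The sketch below is a minimal illustration with hypothetical data, not ESRI's actual pipeline; the function names and the `layer` property are my own invention.

```python
import json

def as_feature(lon, lat, **props):
    """Wrap a longitude/latitude pair as a GeoJSON Point Feature."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": props,
    }

def layered_map(track_points, media_points):
    """Merge a storm-track layer and a geotagged-media layer into one
    GeoJSON FeatureCollection, tagging each feature with its source layer."""
    features = [
        as_feature(lon, lat, layer="track", **props)
        for lon, lat, props in track_points
    ]
    features += [
        as_feature(lon, lat, layer="media", **props)
        for lon, lat, props in media_points
    ]
    return {"type": "FeatureCollection", "features": features}

# Hypothetical sample data: two track fixes and one geotagged photo.
track = [(-75.0, 35.2, {"time": "2011-08-27T12:00Z"}),
         (-74.2, 38.9, {"time": "2011-08-28T00:00Z"})]
media = [(-74.0, 40.7, {"source": "flickr", "caption": "downed tree"})]

geojson = json.dumps(layered_map(track, media))
```

Any GeoJSON-aware viewer can then toggle the layers independently by filtering on the `layer` property, which is essentially what the overlay interfaces described above expose through a friendlier UI.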
The Federal Emergency Management Agency (FEMA) also developed a host of mobile applications for hurricane preparedness and evacuation. Granted, the usual amount of Twitter rumor and innuendo prevailed, but the strong presence of curated information from the government helped counteract information overload. I myself had to ignore my usual Twitter and Facebook timelines entirely and focus on the applications listed above in order to avoid drowning in data. While government resources at times strained under the weight of unprecedented traffic, performance on the whole was very strong and reflected an internalization of what big data and social media pioneers have been urging for a long time.