The Global Katrina Effect - Deadline Extended for Paper Submissions!

I just found out that the deadline for papers has been extended to Friday, June 20, 2014. This is a great opportunity to submit papers you have been working on, whether you are in academia or a practitioner.

"Since the goal of the symposium is to advance new ways of understanding the impact of Hurricane Katrina through a cross-national comparative examination of case studies, proposals should adhere to the following structure: 1) highlighting what happened during Hurricane Katrina regarding a specific subject area; 2) reviewing changes in institutions, procedures or law in the United States as a result of lessons learned from Katrina in this sector and 3) identifying how other countries adapted their emergency management systems/policies post-Katrina and whether these innovative changes might be utilized by the US and other countries. Interdisciplinary studies are particularly encouraged."


The Global Katrina Effect: An International Research Symposium

Center for Disaster Research & Education

Millersville University of Pennsylvania, USA

October 1 – 3, 2014

August 2015 will mark the tenth anniversary of Hurricane Katrina, considered one of the defining historic events in the emergency management field in the United States. Accordingly, this anniversary will prompt numerous reflective academic assessments of how this disaster, which struck the Louisiana and Mississippi Gulf Coasts, changed the US emergency management landscape. Less known, however, is the impact that Hurricane Katrina had on disaster management systems in other countries, across subject areas ranging from emergency preparedness to coastal management to vulnerable populations to companion animals. To highlight the global lessons drawn from Hurricane Katrina, Millersville University will host an international research symposium on October 1-3, 2014, which will bring together policymakers, practitioners and academics from around the world. This interdisciplinary gathering will take place over three days, during which juried scholarly papers will address how other countries modified their disaster response institutions, practices or policies after the initial American mismanagement of the Katrina crisis came to light.

via Millersville University - Center for Disaster Research & Education.

Really, What is Situation Awareness?

We talk about situation awareness a lot. In fact, it permeates a lot of our central assumptions and decisions.  It also impacts how we strategize about our operations and planning, especially as it relates to information management. So really, what is situation awareness?  How does it relate to information management?

Situation awareness is the primary conceptual tool that disaster personnel use to manage all the information that disasters create. Endsley (1988) describes situation awareness as “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.” According to Endsley (2000), situation awareness is a distinct stage from decision making and the subsequent performance of actions. Endsley (1995, 2000) defines three levels of situation awareness leading to decision making and the performance of actions:

  • Level 1 – Perception of current situation,
  • Level 2 – Comprehension of current situation, and
  • Level 3 – Projection of future status.
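
Endsley's three levels can be pictured as a simple processing pipeline. The sketch below is only an illustration of the model, not an implementation from the literature; the function names, the toy reports, and the naive growth-based projection are my own assumptions.

```python
# Illustrative sketch of Endsley's three levels of situation awareness.
# Function names and data are assumptions, not Endsley's.

def perceive(raw_reports):
    """Level 1: perception of elements in the environment."""
    return [r.strip().lower() for r in raw_reports if r.strip()]

def comprehend(elements):
    """Level 2: comprehension -- aggregate elements into meaning."""
    counts = {}
    for e in elements:
        counts[e] = counts.get(e, 0) + 1
    return counts

def project(counts, growth=1.5):
    """Level 3: projection of near-future status (here, a naive trend)."""
    return {k: round(v * growth) for k, v in counts.items()}

reports = ["power outage", "Power Outage", "road flooded", ""]
status_now = comprehend(perceive(reports))   # {'power outage': 2, 'road flooded': 1}
status_soon = project(status_now)            # {'power outage': 3, 'road flooded': 2}
```

The point of the sketch is the separation of stages: each level consumes the output of the one below it, which matches Endsley's note that situation awareness is a state of knowledge distinct from the processes used to achieve it.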

Situation Awareness in Dynamic Decision Making

Harrald and Jefferson (2007) add that in emergency management, data interoperability leads to a common operating picture, which then leads to situation awareness. These concepts are intricately linked and build off one another. Endsley (2000) points out that situation awareness “is a state of knowledge about a dynamic environment. This is different than the processes used to achieve that knowledge.”

As situation awareness is achieved, though, employing a decision aid such as John Boyd's OODA loop helps decision makers make the right decisions. Originally developed for military application, the OODA loop has application across a wide range of situations.  It has four central components: 1) Observe, 2) Orient, 3) Decide, 4) Act.
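
As a rough illustration, the four components can be expressed as a single control-loop pass. The assessment rule, the doctrine mapping, and all names below are illustrative assumptions, not part of Boyd's formulation.

```python
# A hedged sketch of one OODA pass; the assessment threshold and the
# doctrine lookup are invented for illustration.

def ooda_cycle(observations, doctrine, act):
    # Observe: take in the latest available reports
    observed = list(observations)
    # Orient: interpret observations against experience and doctrine
    assessment = "escalating" if len(observed) > 3 else "stable"
    # Decide: select an action appropriate to the assessment
    decision = doctrine.get(assessment, "monitor")
    # Act: execute, feeding results back into the next observation phase
    return act(decision)

doctrine = {"escalating": "activate EOC", "stable": "continue monitoring"}
print(ooda_cycle(["r1", "r2", "r3", "r4"], doctrine, act=lambda d: d))
# -> activate EOC
```

In practice the loop repeats continuously, with the results of each action becoming part of the next round of observation.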


So if our goal is to achieve good situation awareness so we can make better decisions, where do we begin?

Researchers such as King (2005) focus on how knowledge management concepts can be used to categorize the disaster information needs of decision makers. Knowledge management has close ties to situation awareness, as both deal with the effective use of information. However, situation awareness uses knowledge management concepts to inform decision makers specifically in highly dynamic, mission-critical environments. According to King (2005), "[k]nowledge management is the systematic process and strategy for finding, capturing, organizing, distilling and presenting data, information, and knowledge for a specific purpose and to serve a specific organization or community." He bases his categorization on the needs of decision makers as a whole and finds four types of knowledge that decision makers seek out:

  1. Situational
  2. Background
  3. Operational
  4. Analytical
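
One way to picture these four knowledge types is as routing rules for incoming information. The keyword rules below are invented purely for illustration; King (2005) does not prescribe any such mapping.

```python
# Hypothetical keyword rules routing items into King's (2005) four
# knowledge types; the rules themselves are assumptions for illustration.
RULES = {
    "situational": ["outage", "flood", "road closed", "injuries"],
    "background": ["census", "floodplain", "demographics"],
    "operational": ["shelter", "staffing", "resource"],
    "analytical": ["trend", "forecast", "projection"],
}

def classify(item):
    """Return the first knowledge type whose keywords match the item."""
    text = item.lower()
    for ktype, keywords in RULES.items():
        if any(k in text for k in keywords):
            return ktype
    return "unclassified"

print(classify("Road closed at Main St"))    # -> situational
print(classify("72-hour rainfall forecast")) # -> analytical
```

Even this toy version shows the limitation discussed below: real disaster information rarely falls cleanly into one bucket, and simple rules leave much of it unclassified.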

This categorization is quite high level, though, and does not fully address the range of decision making information needs. In addition, the ability to find, capture, organize, distill and present data, information and knowledge is quite challenging in practice. King (2005) points out the challenges organizations face:

“Information is constantly changing, comes from a multitude of sources and is often incomplete or contradictory. In some cases, there is an overload of information and, in other cases, there are complete gaps in what we know. Collecting information is often difficult, if not impossible, because of inaccessibility to the affected areas due to natural hazards, insecurity or government restrictions.”

Zhang et al. (2002) add:

“Relief agencies are slowly developing the infrastructure to undertake effective information and knowledge management. A huge amount of information is collected but not efficiently used. Despite advanced technology achievements, many decision are still taken in emergencies with little information beyond that in people’s heads.”

Information management is inherently complex. Overcoming this complexity is a key challenge for our future as more and more information becomes available, especially in real-time. So what approach will yield the greatest effort-to-outcome ratio in improving information management? I think we still have a ways to go on this; simply categorizing and typing all available information is impractical, costly, and causes strategists and technologists to lose focus on important, high-value contributions.

So what are your thoughts on situation awareness? Does this jibe with your interpretation? What are the challenges you face in practice?

Real-time Social Analytics for Disaster Response and Emergency Management

I had the great pleasure of meeting the SocialAI team in February when they demonstrated their social analytics tool for disaster management to a key audience of practitioners and technology enthusiasts. The team comes from the well-known Georgia Tech Research Institute and is participating in the Humanitarian Technology: Science, Systems, and Global Impact Conference taking place this May 13-15 in Boston, MA. If you are interested in the subject of humanitarian technology and happen to be in the area, I highly recommend that you attend to check out SocialAI and the host of other tools working in this arena. Their exhibit is titled: Real-time Social Analytics for Disaster Response & Emergency Management.

SocialAI Dashboard - Data fusion of electric grid data and tweet reports from individuals during the 2014 Atlanta #Snowpocalypse. Using the dashboard, stakeholders can identify areas on the map where new incidents are occurring, e.g. new power outages, or where new shelters might best be opened.

The analysis they are doing related to social media in disaster management is quite remarkable. But they have a big mission ahead of them as they attempt to make social network theory and social media analysis relevant for practitioners.  This is a near universal problem this industry faces and SocialAI is one of the key players making progress in the area.

SocialAI Analysis – identifying the most influential tweeters discussing Hurricane Sandy by reconstructing the social network graph based on their mentions and retweets (RTs).
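
As a rough sketch of the idea (not SocialAI's actual method), influence in a mention-and-retweet graph can be approximated by counting inbound edges per account. The data and field names below are invented for illustration.

```python
# Minimal influence sketch: rank accounts by how often others mention
# or retweet them, using in-degree as a crude influence proxy.
from collections import Counter

tweets = [
    {"user": "alice", "mentions": ["redcross"], "rt_of": None},
    {"user": "bob",   "mentions": [],           "rt_of": "redcross"},
    {"user": "carol", "mentions": ["fema"],     "rt_of": "redcross"},
]

def influence_scores(tweets):
    """Count inbound edges (mentions + retweets) per account."""
    score = Counter()
    for t in tweets:
        for m in t["mentions"]:
            score[m] += 1
        if t["rt_of"]:
            score[t["rt_of"]] += 1
    return score.most_common()

print(influence_scores(tweets))  # [('redcross', 3), ('fema', 1)]
```

Real systems typically go further, e.g. with centrality measures over the full graph, but in-degree already surfaces the accounts a response team may want to watch or engage.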

Check out more about their research as well as the team on their website. As they develop the system further, I know they will be looking for more practitioner feedback. This is a vital component of the system, and it won't be as effective as it could be without your support.

Also, what thoughts do you have on the system?  What kinds of social network information do you seek in your operations?

Social Media and Situational Awareness at a Joint Interagency Field Exploration

This article was first published in the April issue of the IAEM Bulletin and highlighted lessons learned from a Joint Interagency Field Exploration that put technologists and decision makers in the same room to achieve new possibilities. Mary Jo Flynn was the first author; I supported the article as second author. We both participated in the event.

High impact and high visibility disasters have increasingly revealed the proliferation and widespread use of mobile devices, social media, photos, videos, and other sensory data and channels as information sources. This information can be helpful in planning for, responding to, and recovering from disasters and emergencies. The amount and speed of available information, however, combined with the inability to identify, verify, aggregate, coordinate, and contextualize information gleaned from social media, often leaves data unused and un-actionable.


To address technology gaps across a variety of disciplines, including information sharing, the U.S. Department of Homeland Security Science and Technology Directorate, in partnership with the Office of the Secretary of Defense and the Naval Postgraduate School, hosts the Joint Interagency Field Exploration (JIFX). Each quarter, JIFX participants utilize different methods of interaction, all of which focus on end user input, reflecting and addressing the most complex challenges identified by those directly engaged in homeland defense and security. JIFX 2014-2, held at Camp Roberts, Ca., February 10-13, offered participants an opportunity to take part in an experiment examining the usefulness of social media and data to address agency mission objectives and pre-existing information requirements to achieve enhanced situational awareness and decision support.


Members of the DHS Virtual Social Media Working Group, including individuals from Anaheim [Ca.] Emergency Management, San Francisco Department of Emergency Management, Johnson County [Ks.] Sheriff, New York City Office of Emergency Management, George Washington University, Wright State University, Humanity Road, U.S. Health and Human Services, U.S. Northern Command, the National Guard, and many others participated in a three day event to test how useful information gleaned from social media sources could have been during Hurricane Sandy, if it had been easily available. Several technology companies participated as well, offering their tools for the purposes of testing how to identify, leverage, integrate, and visualize social media and other types of data within an operational environment.


The scenario was based on factual data identified from actual events in Hurricane Sandy, and included several “moves” that spanned the pre-event, onset and response, and recovery phases of the incident, covering weather conditions, storm effects on Critical Infrastructure and Key Resources (CIKR), ongoing response efforts, population actions, and social media. Participants and technologists worked together to identify what type of information, if any, could be leveraged from additional sources to inform emergency response decision-making during the event. The group focused on specific information needs within these areas, including mission requirements; applicable keywords; potential thresholds to assist with prioritizing available information; and the essential elements of information needed, including technical specifics (e.g. detail, format, update frequency, and visualization method).

As the exercise played out, participants concluded that the data available, which had been stripped of personally identifiable information (PII), was not detailed enough to produce a clear picture of events as they unfolded. Difficulties arose as well, in the discussions between the technologists and the participants (end users) regarding the specific objectives, requirements, and applications of the available technologies. It became apparent that there was a significant disconnect in concept, meaning, and terminology that must be addressed in order to support future technology development.  

Since the exercise plan was already built around a storm scenario, the group decided to switch to a live event, the Nor’easter poised to hit Atlanta, Georgia and the East Coast, on February 12 and 13. The exercise was modified to allow all technologists to perform work in teams while subject matter experts provided information on data needs for decision making and appropriate visualizations.  Subject matter experts provided a real-world link to actual impacted communities through Humanity Road, which officially activated to support response to the event, and other connections established during the exercise.

Lessons Learned

The transition to a live scenario proved informative as technologists attempted to adjust their solutions to the needs of emergency managers in real-time. One key finding was that technologies need to focus on 1) anomaly and change detection, and 2) enabling the decision maker to inquire further about other potential impacts or why such a change or event is taking place. Another key finding was that while automated search may help to build situational awareness, full accounts are still best made by the experts, who can contextualize information better and faster than information systems. Although decisions still rest with an individual, the technologies at the JIFX provided significantly greater insight into the situation when integrated with other data sources and technologies.

Participants identified several additional lessons learned, including:

  • The need to identify information requirements, both individually and in “packages” (e.g. groups of information that, together, satisfy various questions);
  • The need for better categorization and discoverability of available data to ensure potential resources are identified prior to an event;
  • Mission objectives must be pre-identified and defined to align with technical requirements to ensure technology is leveraged effectively.
  • Establishment of baseline monitoring capabilities will be useful in determining the occurrence of events, or anomaly detection.
  • While automation of analysis will help in minimizing the time required to identify useful information, manual input and/or consideration will help to ensure the veracity and applicability of found information.
  • Definition of relationships between multiple information sources, including cascading effects and additional information requirements, will assist in further contextualizing information as it relates to the operational environment at hand.
  • Due to the volume of data available, filtering queries as defined by pre-existing mission and information objectives may prove more useful than filtering all results.
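
The baseline-monitoring and anomaly-detection lessons above can be sketched as a simple rolling-baseline check on message volume. The window size, threshold, and data below are illustrative assumptions, not anything tested at the JIFX.

```python
# Sketch of baseline monitoring for anomaly detection: flag a reporting
# interval whose message count deviates far from a rolling baseline.
from statistics import mean, stdev

def anomalies(counts, window=5, z_threshold=3.0):
    """Return indices whose count is > z_threshold sigmas off baseline."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(counts[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

hourly = [10, 12, 11, 9, 10, 11, 10, 95, 12]
print(anomalies(hourly))  # [7] -- the spike at index 7 is flagged
```

Per the findings above, such automation would only surface candidate events; a human analyst would still contextualize each flagged spike before acting on it.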


The shift from scenario to live event clearly demonstrated a need for real-time evaluation of technologies to accurately determine the usefulness of tools. Additionally, the removal of all PII presented a challenge to government agencies needing crucial life-saving information; additional consideration is necessary to determine how best to move forward with trend analysis, ensuring that accessible information, even where limited by policy and legal considerations, is used efficiently and effectively.

Attend the Best Conference for Information Systems in Disaster Management

During May 18-21, 2014, Penn State University will be hosting the 11th annual Information Systems for Crisis Response and Management (ISCRAM) Conference.  ISCRAM is an international community promoting research and development, exchange of knowledge and deployment of information systems in the field of crisis management.  The May conference is expecting to draw approximately 300 attendees.

You are encouraged to engage the expert panelists on the following four topics.

  1. Super Typhoon Haiyan: the Information Management Disaster?
  2. Crowdsourcing Crisis Response: The Boston Marathon Bombing
  3. Doing IT Right: Ethical, Legal and Social Issues of IT Supported Emergency Response
  4. Creating a Common Operational Picture (COP) with the Crowd

At last year’s ISCRAM conference, the headline topics were holistic crisis management as well as the need to close the gap between scientists and the practitioners.  It is envisioned that ISCRAM 2014 will advance these discussions as the organization positions itself to take a leadership role to reframe information systems for emergency response and crisis management scenarios.

Is Interoperability Really Needed?

Most people will respond to this question with an emphatic "YES!"  And I would concur with this statement to a large degree.  However, I would argue that saying "yes" is quite an incomplete response.

There are so many facets to interoperability.  Two of the main facets include: 1) how our crisis management structures work together, and 2) how our technical systems for data and information sharing communicate.

But before we get to making these two facets compatible, we must balance the need for information with the availability of that information. We need to define the types of information we truly need to respond in order to better prioritize progress on interoperability. We cannot just say we are going to make everything "interoperable." In reality, interoperability is quite a large subject with many micro-issues.

Often, we first ask the question: "What data/information is available?"  However, to obtain better data and information, we need to start with the question: "What data/information do we need?"  This helps guide design and development toward the most impactful milestones.

As we approach big data, this will become even more important as we have to process and make sense of the massive amounts of information that will become available to us.   Ultimately, we may have the great majority of data we asked for; but if 50% is not useful, we just wasted a lot of critical time and effort sifting through it all.

WiFi as Disaster Aid...via a Balloon!

Yup, you heard that correctly...WiFi via a balloon. This is a very exciting endeavor and one that is definitely within the realm of possibility. And not only is this possible, but it is also quite complementary to my colleague's idea of smartphones as disaster aid.

Over the past several years, the advances we have made in software and hardware have been amazing and quite useful, including helping us better share and analyze information in real-time and across domains and organizations. However, these advances have significantly increased our dependence on internet connectivity.

My interest in WiFi balloons was first piqued when Google publicized Project Loon. Project Loon aims to bring internet access to the world, especially to developing countries. The project is very much in the pilot phase, but they have a couple of successful test flights under their belt.

As a quick aside, this project also falls in line with the belief that internet access should be a human right, not a privilege. This debate will likely continue for a while, but for now, there is a huge need in disaster operations for this kind of technology.

In fact, The University of Michigan is working on such a project.  Aaron Ridley and a team of researchers are looking at how high-altitude balloons can be launched within an hour of a disaster and carry WiFi routers to impact zones. According to the University of Michigan:

The balloons would become platforms from which Internet-to-ground signals could be sustained and controlled throughout emergencies. This kind of rapid response and reliable, real-time communications with first responders could mean the difference between life and death for otherwise helpless victims.

This is quite promising as it has the potential to build capacity and redundancy for one of our critical dependencies...Internet access.

How do you see this being deployed in a disaster?

UPDATED: Looking for Disaster & Emergency Management Journals? Look no further!

January 24, 2014: Since posting, I have received comments on how valuable the list is, as well as how one can contribute other journals. I have uploaded the list to Google Docs and created a form to contribute additional journals.

Back in May 2013, Professor Ali Asgary from York University in Canada wrote a great article in the IAEM Bulletin. He discussed the state of academic and research journals in disaster and emergency management. His findings were based on his own research with his master's student, which produced a list of over 125 core journals. They categorized the journals by:

  1. Business Continuity
  2. Disaster and Emergency Management
  3. Hazard
  4. Risk

Here are some of the key findings:

From Hazard Science to Disaster and Emergency Management. Core DEM journals can be classified into these categories: risk and risk management (37.6%); disaster and emergency management (28.8%); hazard science and mitigation (28%); and business continuity (5.6%).

Status and Format. Of the 125 core EM journals, 106 journals are currently active. About 21 journals are published online only, and about 104 journals appear in print editions only or are published in both print and online formats. The first online only DEM journals appeared in 1997, with an increasing number emerging in recent years.

Publishers and Country of Publication. About 80 publishers from 189 countries are involved in the publication of the core DEM journals. However, as with any other discipline, major publishers, such as Routledge, Inderscience Publishers, I.G.I. Global, Emerald Group Publishing Ltd., Elsevier Ltd., and Wiley-Blackwell Publishing Ltd., publish the majority of EM journals. Most such journals are published in the United Kingdom (45), the United States

Growth and Change. The first DEM-related journals started in 1957, with the publication of a risk-related journal called the Journal of Risk and Insurance. This trend continued with the publication of a hazard-related journal in 1964.

Overall, he found significant growth in journals starting in the 1990s. In that time, journal focus also shifted from being mainly hazard-specific to more disaster and emergency management related.

The Crisis Leader [NEW BOOK]

I am very pleased to announce a great book by a great colleague.  As of today, The Crisis Leader: The Art of Leadership in Times of Crisis is available for purchase from Amazon.

It is a book that explores the very tough, but very real, problem of leadership in crisis. This leadership skill is distinct from other types of leadership, and Gisli tackles the subject with a direct approach and simple writing. He interweaves stories from his vast personal experience to highlight the complexities of the subject.

The Crisis Leader: The Art of Leadership in Times of Crisis

Gisli draws on his vast experience as a crisis leader having worked in numerous crisis situations.  Here is his full bio:

Gisli Olafsson has been the Emergency Response Director of NetHope since November 2010. In his current role he is responsible for emergency preparedness and emergency response activities related to information and communication technology (ICT) within the forty-one NetHope member organizations.

Prior to that role he worked as a Disaster Management - Technical Advisor for Microsoft Corporation from September 2007 to October 2010. In that capacity, Gisli was responsible for providing guidance to international organizations, such as UN, IFRC, World Bank, Commonwealth, USAID and NATO, on the effective use of ICT to enhance response to natural disasters.

Gisli has over 15 years of experience in the field of disaster management and is an active member of the United Nations Disaster Assessment and Coordination (UNDAC) team, a team of experienced disaster managers who are on standby to deploy anywhere in the world on six hours' notice to coordinate the first response of the international community to disasters on behalf of the UN Office for the Coordination of Humanitarian Affairs (OCHA).

Gisli was also a team leader for Iceland's international Urban Search and Rescue team (ICE-SAR) which is classified as a medium USAR team by the UN. Gisli was the team leader for ICE-SAR in the Haiti Earthquake in 2010. Gisli has years of experience as an incident commander and served as part of Iceland's National Search and Rescue Command for years. Gisli was a lead member of King County's Emergency Operation Centre's Support team while living in Seattle and took part in coordinating over 500 disaster management and SAR incidents.

In recent years Gisli has participated in disaster field missions in connection with floods in Ghana (2007), Cyclone Nargis in Myanmar (2008), Hurricane Ike in Texas (2008), the Sichuan Earthquake (2008), the Pandemic Outbreak (2009), the West Sumatra Earthquake (2009), the Haiti Earthquake (2010), the Japan Earthquake/Tsunami (2011), the Horn of Africa famine (2011), and Typhoons Bopha (2012) and Haiyan (2013) in the Philippines.

2013-2014 Business Continuity Management Benchmarking Study

I just received word that the 2013-2014 Continuity Insights and KPMG LLC Global Business Continuity Management (BCM) Benchmarking Study has been released for participation. This is the premier benchmarking study in the industry.

You can participate now by following this link.  The study will close February 21, 2014.

According to Continuity Insights:

All study participants will receive upon request a complimentary copy of the study results:  valuable information to enhance your program and benchmark your organization against various industry metrics. To view a copy of the past 2011-2012 BCM Study, please visit:

The study digs deep into today's most critical business continuity challenges, such as BCM performance measurements; adoption and implementation of global regulations and standards; budget status, development, and allocation; supply chain issues; and a great deal more!

NEW FEMA Social Media Jobs

FEMA has just taken yet another giant leap forward in progressing its social media presence. In the coming month, FEMA will be hiring 9 new public affairs specialists to focus solely on social media. (Thank you Kim Stephens for the lead!) These are brand new positions and will have a huge role in shaping the future of social media at FEMA. In fact, that is exactly the kind of trait being sought. Jason Lindsmithe, Social Media & Mobile Lead at FEMA, states:

We’re looking for people willing to push the envelope, be creative, and set the gold standard for digital engagement before/during/after disasters.

They will work on disaster-related projects and priorities, so they’ll be fast-paced and work on highly visible initiatives.

The positions have 2-year terms, with the possibility for renewal following the two years, depending on available funding & need.

The positions below will expire on USAJobs on Tuesday, November 14.

  1. Public Affairs Specialist Social Content (CORE) GS-1035-9/11 (Link)
  2. Public Affairs Specialist Digital Engagement Mobile Platform (CORE) GS-1035-9/11 (Link)
  3. Digital Engagement Training Specialist (CORE) GS-1089-11/12 (Link)
  4. Public Affairs Specialist Digital Engagement-Multilingual (CORE) GS-7-9 (Link)
  5. Writer (CORE) GS-1089-9/11 (Link)

Other positions coming soon:

IT Specialist (CORE) Digital Engagement Programmer GS-2210-11

The incumbent is charged with enhancing functionality of the agency’s existing and new digital engagement channels to better reach those impacted by a disaster or emergency.

Public Affairs Specialist Digital Engagement Web Designer (CORE) GS-1035-9/11

The incumbent is charged with creating visually appealing digital products and websites as an important part of telling FEMA’s story, communicating critical safety and recovery information, and quickly impacting people who may be in the midst of an emergency.

Public Affairs Specialist Digital Engagement Web Content (CORE) GS-1035-9/11

The incumbent is charged with developing, implementing and evaluating digital communication plans and tools that contribute to improving FEMA communications operations and objectives through the effective use of web tools and platforms.

Public Affairs Specialist Digital Engagement Social Listening (CORE) GS-1035-11/12

The incumbent is charged with effectively listening through social media channels to provide improved situational awareness during disasters, enable better messaging from ESF 15 during crises, and increase information sharing among FEMA and its disaster response partners.

How You Can Help 'Crowdsource' Typhoon Yolanda Response (UPDATED)

Update. This blog post has been updated since its original posting to provide additional background on MicroMappers' two primary initiatives (TweetClicker and ImageClicker) and provide additional explanation.  

Update 2. As of 9am Eastern on 11/13, no more Tweets and images are being added to the applications. However, you can still view results on the crisis map.

Typhoon Yolanda hit the Philippines this past Friday as one of the largest and most powerful storms ever recorded on earth. Many initiatives are underway to support response efforts. However, if you would like to support response efforts with your time and energy rather than donating, MicroMappers, at the request of the United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA), has stood up two applications to help quickly identify ("tag") information from tweets and images relevant to disaster responders.

TweetClicker and ImageClicker are both simple-to-use "microtasking" applications for verifying Tweets and images gathered from social media. The goal is to leverage the "crowd" to help sift through the massive amounts of data collected. Neither application requires technical expertise, and both can be used on your computer or mobile device. Each application runs you through a simple tutorial before beginning. Each message takes about 3 seconds to review and will also be reviewed by two other people, so your selections will be validated by others as well.
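
The multi-reviewer validation described above amounts to majority-vote consensus. Here is a minimal sketch of that idea; the labels and agreement threshold are invented for illustration and are not MicroMappers' actual schema.

```python
# Majority-vote consensus over volunteer tags: a label is accepted only
# when enough independent reviewers agree on it.
from collections import Counter

def consensus(labels, min_agreement=2):
    """Return the majority label if enough reviewers agree, else None."""
    label, votes = Counter(labels).most_common(1)[0]
    return label if votes >= min_agreement else None

print(consensus(["relevant", "relevant", "not_relevant"]))  # relevant
print(consensus(["relevant", "not_relevant", "unsure"]))    # None
```

Redundant review like this trades volunteer time for reliability: a single mistaken click cannot put a bad tag on the crisis map.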

NOTE: If you encounter a "100% complete" notice when navigating to the pages, keep checking back every hour. The applications are adding new messages and images to verify continuously. 

The results of this effort are being displayed on a live crisis map supported by the StandbyTaskForce and GISCorps, both members of the Digital Humanitarian Network. Each of these groups is a network of people and organizations with a mission to support the formal and informal response.

In the response to Typhoon Yolanda/Haiyan, they are digitally skilled volunteers acting as force multipliers. Conceptually, they are similar to the Red Cross's Digital Operations Center, which leverages digital volunteers to support response efforts. However, describing these organizations and how they operate is a separate post.

Leading this effort, though, is MicroMappers. The initiative (loosely defined) is a partnership between QCRI, CrowdCrafting, and UN OCHA, and is led by a number of technologists including Patrick Meier, Ji Lucas, Luis, Daniel, Ariba Jahan, Christine Jackson, and Daniel Lombrana Gonzalez.

For more background and continuous updates on Typhoon Yolanda/Haiyan response efforts using TweetClicker and ImageClicker, check out this blog post.

Why is Crisis Mapping So Popular?

I was recently asked this question by a colleague.  I didn't have a full answer at the ready, so I thought about it some more.

Crisis mapping is usually conducted with the aim of producing "maps" that have key geographic data relevant to a response.  According to Wikipedia,

"Crisis mapping is the real-time gathering, display and analysis of data during a crisis, usually a natural disaster or social/political conflict (violence, elections, etc.)."

So why is crisis mapping so popular? To understand its popularity, we have to look back to when "mapping" was first popularized during the Haiti earthquake of January 12, 2010. Crisis mapping did exist before Haiti, but primarily with the resources and motivation of organizations like National Geographic.

To support the response effort, a group of "mappers" nowhere near the earthquake used an open source tool called Ushahidi to begin mapping tweets and other information collected from the Internet to provide better situational awareness. At one point, Craig Fugate, the Administrator of FEMA, praised the Haiti crisis map as "the most comprehensive and up-to-date map available."

The "crisis mappers," as they later became known, were just a group of unaffiliated, spontaneous volunteers. Most had no prior mapping or GIS experience. They worked independent of any one authority to produce maps that would be useful to on-the-ground responders and coordinators.

Ushahidi was designed around the needs of a consumer and a problem, not a list of technical requirements handed down by an organization. As a result, the software was built for non-technical people to use. This enabled people not formally trained in mapping or GIS to support mapping efforts and generated a slew of publicity for Ushahidi as the go-to crisis mapping tool.

Of course, every platform has its limitations. Still, Ushahidi has worked hard in recent years to improve the software and has even released a hosted version called CrowdMap. Similarly, other tools such as MapBox have devoted considerable effort to developing easy-to-use mapping tools.

However, easy-to-use tools, while important, are not the only reason for the popularity of crisis mapping.


Consumerization of Technology

This "consumerization" of technology is now enabling mapping to shift from an EOC support function to a core skill of the modern emergency manager. Without the support of a technical specialist, emergency managers can begin to answer their own questions faster and more easily throughout a response. They can go into greater detail in their analysis and research to better understand the situation before them.

This was a critical factor in allowing the crisis mappers to utilize Ushahidi during the Haiti response.  They were able to easily adjust their work based on the expanding needs of on-the-ground responders without much technical knowledge and support.

Consumer-based technologies help reduce interdependencies, add efficiencies, and enable emergency managers and responders at all levels of the response to take more ownership of their functional areas. Emergency managers get to focus on their domain and answer their own questions as the response progresses, while the GIS specialist is freed up to work on more complex geospatial needs applicable to a broader audience. Pretty soon, there will be no need for a dedicated GIS specialist because everyone will be a GIS specialist! The skill is becoming commoditized and ubiquitous.

Availability of Data

Getting data from multiple sources is becoming easier and easier as governments and organizations devote more resources to "freeing" data from their closed, antiquated and locked databases.  The shift in thinking has moved from protecting all data from outsiders to recognizing the value of certain shared data across different organizations.  In the case of Haiti, the crisis mappers were able to pull public data via social media and a special texting shortcode that had implied consent.  However, a lot of great data still exists in the silos of organizations.

In early 2011, New York City hired a Chief Digital Officer to help navigate the complex policies that had previously prevented such access to data. To help disseminate data, NYC launched an Open Data Portal where you can easily access flood zone, shelter, and fire station location data in a variety of formats. Better yet, you can actually bring this data into your own systems and mash it up against other data to produce more value-oriented analysis and solutions. Historical data and real-time data need not be mutually exclusive anymore.
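The "bring this data into your own systems" part is straightforward in practice: NYC's portal is backed by the Socrata Open Data API (SODA), which exposes each dataset as JSON at a stable URL. A hedged sketch of building a filtered query; the dataset ID and field name below are invented placeholders, not a real NYC dataset:

```python
from urllib.parse import urlencode

def soda_query_url(domain, dataset_id, **filters):
    """Build a SODA JSON query URL with simple equality filters."""
    base = f"https://{domain}/resource/{dataset_id}.json"
    return f"{base}?{urlencode(filters)}" if filters else base

# Hypothetical example: fetch shelter records for one borough
url = soda_query_url("data.cityofnewyork.us", "abcd-1234", borough="Brooklyn")
# The resulting URL could then be fetched with urllib.request.urlopen(url)
# and parsed with json.load() into your own systems.
```

The design point is that the portal serves machine-readable data at predictable URLs, so mashing it up against your own data is a download-and-join exercise rather than a records request.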

The more data that is available, the more you can do.  In creating your risk profile, you can easily see and map which of your buildings or offices are in designated flood zones.  Have to discharge patients before, during or after a disaster?  Check to see if they may be in a designated flood zone prior to discharge so alternative arrangements can be made.
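The flood-zone checks above reduce to a point-in-polygon test: is this building's or patient's coordinate inside a zone boundary pulled from open data? A minimal ray-casting sketch in Python; a real workflow would use a GIS library such as Shapely, and the square "zone" here is invented for illustration:

```python
def in_flood_zone(point, zone):
    """Ray-casting point-in-polygon test.

    point: (x, y) coordinate; zone: list of (x, y) polygon vertices.
    Casts a ray to the right and counts edge crossings; odd = inside.
    """
    x, y = point
    inside = False
    n = len(zone)
    for i in range(n):
        x1, y1 = zone[i]
        x2, y2 = zone[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Invented square flood zone covering (0,0) to (10,10)
zone_a = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(in_flood_zone((5, 5), zone_a))   # prints True: inside the zone
print(in_flood_zone((15, 5), zone_a))  # prints False: outside the zone
```

With zone boundaries from an open data portal, the same test answers both questions in the paragraph above: which of your facilities sit in a zone, and whether a discharge destination does.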

Adoption Costs

I have always said that technology should be intuitive for the person who knows his or her job well. This helps reduce costs in two ways: training and efficiency. If a tool is intuitive, less time and money needs to be spent on learning how to use it. Additionally, the more intuitive the tool is and the better it matches the needs of the functional area, the easier it is for the designated person to get his or her job done faster and with fewer errors.

Ushahidi was designed with quick adoption in mind, and the crisis mappers quickly adopted it as their tool of choice. Little training was needed on the tool itself, so the mappers were able to focus on how to get data into the system for added value and insight. The simplicity of the system enabled them to work as quickly as humanly possible without fretting over the large and expansive feature sets and options that bog down so many tools. In a way, Ushahidi was an "expert" system that focused on best practices in crowdsourcing rather than giving the user every option in the world.


Crisis mapping, while the popular concept of the day, is well on its way to becoming a de facto skill in the industry. The lessons from crisis mapping are still being extracted, but its rise in popularity has started giving us a blueprint for what other technologies should embrace.

We are beginning to better understand how technology is helping us do our jobs. The easier tools are to use, and the better they perform their designated functions, the better off we will be as more data becomes available.

What are Decisions Makers' Needs in Sudden Onset Disasters?

One of the greatest problems we face in disaster management is understanding the type and breadth of decisions that we make during a disaster.

So much goes into decision making that we need to devote significant research and effort to putting this skill in better perspective so that better tools and approaches can be developed. Long gone should be the days of making decisions "off the cuff." Decisions, despite their urgency and seriousness, should be as purposeful, collaborative, and science-based as possible.

Andrej Verity, a disaster responder and Information Management Officer for UN-OCHA, just released a report from a workshop on Field-Based Decision Makers' Information Needs. Here is a link to the full report. The main authors included leading researchers Erica Gralla (GWU), Jarrod Goentzel (MIT), and Bartel Van de Walle (Tilburg). Check out Andrej's great introductory post on demystifying decision makers' needs in sudden onset disasters.

The report focuses heavily on the decision-makers' perspective. It asks what decisions are typically made and, separately, what the information needs are in sudden onset disasters. Ultimately, the decisions and information needs will be linked in future research.

One goal of this workshop was to help Volunteer and Technical Communities (VTC) to understand the information field decision-makers require to make the best possible decisions. These results lay a foundation for this understanding, by providing (1) a framework and set of information required by field-based decision-makers, (2) categories and types of decisions made by decision-makers, and (3) a large set of brainstormed decisions from workshop participants. VTCs and others seeking to support humanitarian action by providing and organizing information can utilize these results to (a) prioritize their efforts toward important information, and (b) organize their information in a manner intuitive and useful to humanitarian decision-makers.

Check out pages 7-8 for great pictorials of the following findings regarding decisions and information requirements:

Decision dimensions and categories are broken down by timeframe, scope, locus/authority of decision-making, criticality, frequency/duration of decision, information gap (confidence), and function.

Information requirements are broken down by context and scope, humanitarian needs, responder requirements, meta information, capacity and response planning, operational situation, coordination and institutional structures, and looking forward.

Does this resonate with your work?  Why or why not?

Getting Started on an Emergency Management/Business Continuity Program

The disaster domain is huge. The level of detail and specificity to which you can go is almost infinite. As such, it can be overwhelming for businesses and nonprofits to get started with preparing their organizations for disasters.

In response to an email I just got from a former MPA classmate, I wanted to share some helpful thoughts on how to get started.

The Actions to Take

When discussing this topic, there are four main actions that organizations can take:

  1. Prepare for a Disaster (through planning, training, exercise and equipment)
  2. Plan for Response/Continuity of Operations (responding in the moment/maintaining operations, if possible)
  3. Plan for Recovery (getting back to normal)
  4. Mitigate Impact (stop things from happening in the first place)

Implementing into the Organization

There are many approaches and models to implement these actions (think program management vs. project management). However, the process typically starts with leadership forming a disaster committee of some sort to begin addressing the organization's disaster needs and corrective actions.  The committee then establishes a path forward.

Typical agendas are a variation of the following:

  1. Identify Risks and Gaps
  2. Develop Plan(s) to Address Risks and Gaps (keeping in mind the four actions mentioned above)
  3. Train and Exercise on Those Plans and Purchase Required Tools/Equipment
  4. Redo Steps 1-3 annually (or at designated intervals).

Given the typically resource-constrained environment of organizations, there is a lot of potential to address "low-hanging fruit" once risks and gaps are identified. This is not perfect, as the approach should be as comprehensive as possible, but it is helpful nonetheless.

The important thing is not to fall into a false sense of security just because you have addressed some of the risks and gaps. Coordinating the effort and understanding your strengths and weaknesses are vital to a successful disaster management program.

High Value Resources

Here are a few high-value resources on what nonprofits can begin to do. Grant-making institutions should consider baking some of these principles into their grant requirements.

Domain Headings

If you are looking to do more research in this area, especially as your disaster management program matures, you should look for resources in the following domains:

Getting Started

As a starting point, I highly recommend the following priorities:

  1. Develop a disaster committee led by someone willing and able to champion the effort
  2. Decide whether it is best to shut down, continue operations at full or reduced scale, and/or respond to the disaster (i.e., support the community). This will help clarify how detailed the planning should be for each scenario.
  3. Identify 3 targets for the next year (e.g., establish the committee, develop a program plan, develop a plan)

It is easy to get overwhelmed.  Focus on establishing realistic goals and moving forward.  Any forward movement is better than no movement at all.