How Service Design Helped FEMA Prepare For Disasters

Steven J. Slater
Oct 28, 2022

Hurricane Andrew ripped across southern Florida thirty years ago (1992) in what was then the most destructive hurricane ever to strike the United States, a Category 5 with sustained winds of 165 mph, surpassed only by Hurricane Irma 25 years later . . . and now Hurricane Ian.

Andrew destroyed more than 63,500 houses, damaged more than 124,000 others, and caused $27.3 billion in damage (equivalent to $53 billion in 2021; see https://en.wikipedia.org/wiki/Hurricane_Andrew). Ian is on track to surpass that toll, but another difference this time around is the emergency response. The Federal Emergency Management Agency (FEMA) was ill-prepared for Andrew, drawing criticism that reached Congress and earned FEMA officials a public grilling.

Damage left by Hurricane Ian.

Congress has oversight of FEMA, as it does of the other executive departments and agencies. For the federal government to step in and assist with an emergency response, a state's governor must submit a request to the president.

After Andrew, FEMA’s director assembled a team of independent crisis response professionals at a remote government outpost, Mount Weather, Virginia, approximately 50 miles from the nation’s capital. There, the crisis response communications team hunkered down for two weeks to plan how FEMA would respond to future disasters. Most of the first week was spent drafting plans that would ultimately become the process FEMA would follow; the second week was dedicated to prototyping and testing the proposed plans.

Service Prototyping

Service prototyping involves testing visual or physical replications of a service. Tests can be conducted on an entire system or distinct parts. For service designers, prototyping is used to:

  • Validate the requirements of a service
  • Test any concerns
  • Uncover points of failure
  • Improve upon the design
  • Test specific features of a service

For testing, the team was split into groups, each with a plan to test. FEMA staff orchestrated overall and immediate-response scenarios for the groups to respond to. The measure of success for each group was providing residents in need with access to food, shelter, and financial aid.

The prototyping sessions were quite lively, as FEMA staff would interrupt the teams’ work with fictitious storm updates called inserts, mirroring how live storms and their impacts actually unfold. An example insert might read: a tornado has touched down at (location), uprooting trees and toppling power lines; moreover, all access to the community has been cut off. The groups would then apply their plans to handle the insert, and FEMA officials assessed how each plan fared against the scenarios.
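By way of analogy, the insert mechanism works like event injection in a simulation: a plan is stress-tested by feeding it events it may or may not anticipate, and the unhandled events mark points of failure. The sketch below is purely illustrative; the plan entries and event names are hypothetical, not FEMA's actual procedures.

```python
# Hypothetical sketch: modeling scenario "inserts" as injected events.
# A plan maps an event type to the action a team would take; events the
# plan cannot handle are recorded as gaps to fix in the next iteration.

def run_drill(plan, inserts):
    """Apply a response plan to a stream of injected storm updates."""
    actions, gaps = [], []
    for event in inserts:
        if event in plan:
            actions.append(plan[event])
        else:
            gaps.append(event)  # an uncovered point of failure
    return actions, gaps

# Illustrative plan and inserts, echoing the tornado example above.
plan = {
    "tornado_touchdown": "dispatch damage-assessment team",
    "power_lines_down": "coordinate with utility for restoration",
    "access_cut_off": "stage supplies at nearest open road",
}
inserts = ["tornado_touchdown", "access_cut_off", "shelter_overflow"]

actions, gaps = run_drill(plan, inserts)
print(actions)  # the steps this plan covered
print(gaps)     # ['shelter_overflow'] — a gap the drill uncovered
```

Here the drill surfaces "shelter_overflow" as an event the plan never anticipated, which is exactly the kind of finding the Mount Weather sessions were designed to produce.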

The tests took three hours on average, starting at dawn. After lunch, FEMA staff and the groups gathered for assessments, with rigor around logistics: estimated times to set up field offices, transportation schedules, and the availability and location of materials. Questions also surfaced about which skills personnel would need to satisfy demands.

Prototyping Techniques Used

  • Parallel: The tests were conducted in ‘parallel,’ with the same scenario applied to the various plans so the tests could be compared with one another. Parallel testing also speeds things up; otherwise, testing a number of plans would have proceeded in serial, one after another.
  • Refinement: A technique based on using prior test results to improve subsequent concepts. For the crisis communications team, results from the first round of tests led to better plans in subsequent rounds.
  • Iteration: Iteration involves incremental, situational improvements during the course of a single test. With the FEMA tests, each team could improve upon a plan as the storm scenario unfolded.
  • Active Learning: The active learning phase is when testing occurs in the actual service environment. Several weeks after leaving Mount Weather, the crisis communications team was invited to test the chosen plan in an actual response to a tornado in Arkansas.
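The first two techniques above can be reduced to a short sketch: parallel testing runs every plan against the identical scenario so the scores are directly comparable, and refinement carries the best performer into the next round. Everything here is illustrative; the plans, scenario, and scoring are invented for the example, not drawn from FEMA's exercises.

```python
# Hypothetical sketch of the 'parallel' and 'refinement' techniques.
from concurrent.futures import ThreadPoolExecutor

def score_plan(plan, scenario):
    # Count how many of the scenario's needs the plan covers
    # (food, shelter, and financial aid in the FEMA exercises).
    return sum(1 for need in scenario if need in plan["covers"])

scenario = ["food", "shelter", "financial_aid"]
plans = [
    {"name": "plan_a", "covers": ["food", "shelter"]},
    {"name": "plan_b", "covers": ["food", "shelter", "financial_aid"]},
]

# Parallel: the identical scenario is applied to each plan concurrently,
# so results are comparable and the round finishes faster than in serial.
with ThreadPoolExecutor() as pool:
    scores = list(pool.map(lambda p: (p["name"], score_plan(p, scenario)), plans))

# Refinement: the best performer seeds the next round of testing.
best = max(scores, key=lambda s: s[1])
print(scores)  # one (name, score) pair per plan
print(best)    # plan_b, which covers all three needs
```

Iteration would amount to mutating a plan mid-run as new inserts arrive, and active learning to replacing the synthetic scenario with field data from a real storm.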

Less than a month after the exercises, an actual storm travelled up the midsection of the U.S., through what’s known as Tornado Alley, spawning a series of tornadoes that flattened entire communities. Devastating as it was to people and property, the storm turned out to be a perfect opportunity to test the chosen plan.

Where the storm had hit, only bare earth remained. One resident on a tour of his former hometown was unable to pinpoint where Main Street once was. All the buildings and landmarks, including paved streets, had disappeared.

In the wake of the tornado, FEMA invited the crisis communications team to trial their best response plan from the Mount Weather exercises, putting active learning into practice. The planners evaluated situational responses to the disaster, including the impact on residents and whether the plan’s physical requirements were practical. With some further refinement, the plan was adopted into agency guidelines for disaster communications.

In the long run, the guidelines helped improve FEMA’s reputation. But most importantly, residents impacted by disasters could receive aid and comfort much more rapidly and efficiently.


Steven J. Slater, a service designer, is co-founder of the International Service Design Institute (www.internationalservicedesigninstitute.com).