Texas Observer by Amal Ahmed 8/26/2019
More than half of FEMA’s flood maps rely on decades-old data. Now, a group of Texas researchers is tackling the problem with a $3 million grant and crowdsourced data.

Talk to any scientist long enough, and eventually they’ll bring up an old aphorism: all models are wrong, but some are useful. Even with better data, and more sophisticated tools to collect it, there’s no truly perfect way to capture the dynamic world that we live in.
Two years ago, Texans learned that truth the hard way when Hurricane Harvey hit the Texas coast. Harvey was classified as a 500-year storm. But neighborhoods that FEMA’s flood maps predicted would stay dry—even in a storm of that size—experienced historic, devastating flooding.
FEMA’s maps estimate the expected flood risk of a given area, both to keep people from building in dangerous zones and to inform residents and business owners when an existing property sits in a flood zone. But an analysis by Bloomberg found that many of these maps rely on 40-year-old models based on outdated weather and storm data, and fail to account for changes in land use, like new developments and roads. (Climate change has also altered the strength and speed of hurricanes.) Just weeks after Harvey, the Department of Homeland Security issued a report finding that less than half of FEMA’s maps accurately portrayed current-day flood risks.
In an effort to rectify the problem, FEMA and the Texas General Land Office are partnering on a $3 million, two-year effort to create a new type of floodplain map. Sam Brody, the project’s principal investigator and the director of Texas A&M Galveston’s Center for Texas Beaches and Shores, spoke with the Observer about how to build better and more useful models by incorporating more than just traditional scientific data.
More:
https://www.texasobserver.org/floodplain-maps-are-outdated-this-scientist-wants-to-change-that/