Data for city resilience: tackling the water crisis in Flint, MI

Victor Sette Gripp
Civic Analytics & Urban Intelligence
3 min read · Oct 30, 2016
Tap water in a Flint hospital on October 16. Joyce Zhu/Flint Water Study

The water crisis in Flint, brought to light after Virginia Tech researchers helped prove that the water arriving at some homes in the city was quite literally toxic, is an example of the extreme risk that a series of institutional failures can impose on a community. That is something governments should always keep in mind: institutions can fail, and when they do, many lives are put at risk.

As cities become more densely populated, these systemic risks naturally grow. Our communities become more vulnerable not only to extreme weather events, but also to disease outbreaks, terrorist attacks and, as in Flint, environmental hazards. Mitigating such risks and building cities’ resilience is another piece of the public-resource-allocation puzzle. Technology and data analytics will certainly not bring straightforward solutions to every question that needs answering in that regard, but they are crucial for keeping the discussion evidence-based instead of letting it gravitate toward political demagogy.

Flint’s sad example illustrates perfectly how important continuous data monitoring is for preventing major institutional failures. Beyond that, it is a clear case in which the combined analysis of every type of data available can shed light on a problem that would otherwise be almost impossible for a city to untangle.

Professors Jacob Abernethy and Eric Schwartz, along with PhD students from the University of Michigan, have done some heavy-hitting analysis of Flint’s water distribution system in a study that showcases the value of the information “big data” analytics can generate for cities.

Among their main findings: “lead contamination varies widely across homes and is highly scattered around Flint, but it is surprisingly predictable”. They also note that “Flint’s lead pipe records are spotty and noisy, but statistical methods can significantly fill the gap”.

Based on their statistical models, researchers at the University of Michigan could map the locations estimated to be at high risk of lead contamination. Credit: PhD students Guangsha Shi, Jared Webb, and others at UM.
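That predictability can be made concrete with a minimal sketch of the kind of model involved, written here in Python with scikit-learn. Everything below (the parcel features, the synthetic data and the correlation planted in it) is invented for illustration; it is not the UM team’s actual data, features or code.

```python
# Hypothetical sketch of predicting a home's lead risk from parcel
# records. Features, data and the planted correlation are all invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_homes = 1_000

# Invented parcel features: construction year, assessed value, and a
# noisy flag for whether city records list a lead service line.
year_built = rng.integers(1900, 2000, n_homes)
assessed_value = rng.normal(40_000, 15_000, n_homes)
recorded_lead_line = rng.integers(0, 2, n_homes)

# Synthetic target: older homes with recorded lead lines test high more
# often. This relationship is assumed for the demo, not measured.
p_high = 1 / (1 + np.exp(0.05 * (year_built - 1945) - recorded_lead_line))
tested_high = rng.random(n_homes) < p_high

X = np.column_stack([year_built, assessed_value, recorded_lead_line])
model = RandomForestClassifier(n_estimators=200, random_state=0)

# Cross-validated accuracy on held-out homes is one way to quantify how
# "predictable" contamination is from records alone.
print(cross_val_score(model, X, tested_high, cv=5).mean())
```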

This model has been helping the city allocate its resources to the areas at highest risk, and it was incorporated into a mobile application that lets Flint residents look up their home’s risk level.
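Continuing the illustration, turning such risk scores into an allocation decision can be as simple as ranking homes by predicted risk and working down the list until the budget runs out. The scores and budget below are, again, invented placeholders rather than Flint’s actual figures.

```python
import numpy as np

# Hypothetical sketch: from per-home risk estimates to a prioritized
# work list under a fixed budget. All numbers are placeholders.
rng = np.random.default_rng(1)
risk = rng.random(500)   # stand-in for a model's predicted P(high lead)
budget = 100             # suppose funds cover 100 line replacements

# Address the highest-risk parcels first.
priority = np.argsort(risk)[::-1][:budget]
print("First ten parcels to address:", priority[:10].tolist())
```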

Of course, models have limitations, and there will always be the question of what counts as an acceptable level of risk. That decision will always be part of the problem, and it will be up to each community, along with its policymakers, to agree on this more subjective aspect of risk management. Nevertheless, as professors Jacob Abernethy and Eric Schwartz point out for the specific case of Flint, “data and statistical tools can help greatly reduce risks at much lower cost, and a data-oriented understanding of the problems in Flint can guide efforts to address lead concerns in other regions as well”.

I would claim that this is probably the case for many (if not all) aspects of building more resilient cities. That is not to say we are anywhere close to tackling that big challenge; it is just to reinforce that a “data-oriented understanding of the problems” seems to be the most promising path (if not the only feasible one) toward getting there, and local authorities should treat it as such.
