Preventing flood hazards from becoming horrific disasters

Figure 1. Bridges were ravaged by the raging river when the Chaurabari glacial lake burst on June 16, 2013. Had an efficient flood forecasting system existed, timely evacuation could have minimized the fatalities. Source: https://www.indiatoday.in/india/north/story/uttarakhand-floods-environmentalists-blame-it-on-centre-state-govt-for-man-made-disaster-167419-2013-06-20.

I was a Master’s student, studying environmental science in Dehradun, when the horrific flash floods of 2013 wiped out entire villages in the state of Uttarakhand (north India). We were confined to our hostels, without access to food for a few days, while water in the drain-sized channel behind our university gushed on like a full-blown river. We woke up to watch the news every morning, thinly veiled fear on our faces, silently praying for the relentless rain to stop. Three days later, when it was all over, my friends and I volunteered to help the survivors, who were fast pouring into Doon from all the hill stations. The memory of the dead, and of the missing whom we could not possibly help, haunts me still.

I knew then that the technology to prevent such disasters exists; where we are really lacking is in its on-ground application. They say hindsight is always 20/20, and one might have hoped that the Uttarakhand tragedy taught us something about increasing our flood resilience. Unfortunately, all evidence is currently stacked against that hypothesis. My goal, therefore, is to effect change in the way we plan our forecasting systems, design effective inter-organization communication strategies, and use every piece of data we have to ensure that flood hazards don’t turn into horrific disasters.

The IITB-Monash Research Academy — where I have enrolled for a PhD project titled ‘Towards a Comprehensive Data Assimilation Framework for Operational Hydrodynamic Flood Forecasting’ — is a collaboration between India and Australia that endeavours to strengthen scientific relationships between the two countries. Graduate research scholars here study for a dual-badged PhD from both IIT Bombay and Monash University, spending time at both institutions to enrich their research experience.

For decades, researchers and planners have relied on mathematical models, which are essentially approximations of real-world flow processes expressed as equations, to provide actionable flood forecasts. These models have traditionally been calibrated using historical flow data collected at river gauges and validated against the same records, usually at the channel outlet. However, the global decline in gauge networks, together with flow characteristics that are changing under climate change, has rendered such approaches insufficient. Moreover, gauge-based validation in the channel seldom analyzes the performance of the model on the floodplain, which is counterintuitive, as that is precisely where we need timely forecasts and warnings.
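To make the gauge-based approach concrete, here is a minimal Python sketch (not part of our framework, with purely hypothetical discharge numbers) that scores a model's simulated discharge against observations at an outlet gauge using the Nash-Sutcliffe efficiency, a metric commonly used when calibrating and validating flood models.

```python
import numpy as np

def nash_sutcliffe_efficiency(observed, simulated):
    """Nash-Sutcliffe Efficiency (NSE): 1 is a perfect fit,
    0 means the model is no better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

# Hypothetical daily discharge (m^3/s) at a gauge near the channel outlet.
gauge_q = np.array([120.0, 310.0, 980.0, 1450.0, 900.0, 430.0, 210.0])
model_q = np.array([100.0, 280.0, 1050.0, 1300.0, 950.0, 470.0, 230.0])

print(f"NSE at outlet gauge: {nash_sutcliffe_efficiency(gauge_q, model_q):.2f}")
```

A single score like this at the outlet says nothing about how well the model reproduces inundation on the floodplain, which is exactly the limitation described above.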

Although gauge networks are getting sparser, alternative data sources like Earth observation and citizen-sensed observations from social media are rapidly becoming ubiquitous. Big data is already driving everything from football strategies to elections. It’s time to unleash the full potential of such datasets for humanitarian causes such as disaster management. The advent of satellite technology revolutionized the flood mapping process by providing synoptic observations of flood extent in near real time. What was unexpected was its phenomenal potential for flood forecasting applications.

In the last decade, several studies have used satellite-derived flood depths to calibrate flood models and improve their predictive skill. These studies primarily relied on an indirect estimation of flood water levels, obtained by intersecting the flood boundary with the floodplain topography. Extracting these water levels requires several simplifying assumptions, which add significant uncertainty to the model domain. Directly integrating satellite-derived flood extents into flood models, however, could avoid this added uncertainty, as the inundated area is directly observable from satellites. Further, we compensate for the lack of flood depth information from satellites by integrating crowd-sourced water level observations. By minimizing the number of additional assumptions about the data, we hope to reduce the forecast uncertainty further.
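As an illustration of the indirect approach described above, the short sketch below samples a digital elevation model along the wet/dry boundary of a satellite-derived flood extent to approximate the water surface elevation. The 5x5 grid, mask, and numbers are entirely hypothetical and serve only to make the idea tangible.

```python
import numpy as np

def boundary_water_levels(flood_mask, dem):
    """Indirectly estimate water surface elevations by sampling the DEM
    along the wet/dry boundary of a satellite-derived flood extent.
    flood_mask: 2D boolean array (True = flooded); dem: 2D elevations (m)."""
    wet = flood_mask.astype(bool)
    # A wet pixel lies on the boundary if any 4-connected neighbour is dry.
    padded = np.pad(wet, 1, constant_values=False)
    neighbours_dry = (
        ~padded[:-2, 1:-1] | ~padded[2:, 1:-1] | ~padded[1:-1, :-2] | ~padded[1:-1, 2:]
    )
    boundary = wet & neighbours_dry
    return dem[boundary]

# Hypothetical 5x5 DEM (m) and flood mask, for illustration only.
dem = np.array([
    [12.0, 11.5, 11.0, 11.5, 12.0],
    [11.5, 10.8, 10.2, 10.8, 11.5],
    [11.0, 10.2,  9.5, 10.2, 11.0],
    [11.5, 10.8, 10.2, 10.8, 11.5],
    [12.0, 11.5, 11.0, 11.5, 12.0],
])
mask = dem <= 10.8  # pixels inundated by a roughly 10.8 m water surface

levels = boundary_water_levels(mask, dem)
print(f"Indirect water level estimate: {levels.mean():.2f} +/- {levels.std():.2f} m")
```

The spread of the sampled elevations hints at the uncertainty this shortcut introduces: any error in the mapped boundary or in the DEM propagates directly into the assumed water level.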

Through this work, we hope that more accurate flood maps, with corresponding estimates of uncertainty, will give flood managers and first responders useful tools for efficient mitigation efforts, benefiting the millions of people residing in floodplains. Such maps can also help substantiate insurance claims and define premiums for flood risk insurance.



Figure 2. An advanced mathematical tool called data assimilation is used to integrate flood information from physical models, satellite data, and social media to arrive at better flood forecasts.
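For readers curious about what data assimilation looks like in code, the sketch below shows a minimal stochastic ensemble Kalman filter update, one widely used assimilation technique, in which a single crowd-sourced water level observation nudges an ensemble of modelled floodplain water levels. The state variables, observation operator, and numbers are illustrative assumptions, not our operational setup.

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_update(ensemble, obs, obs_error_std, H):
    """Minimal stochastic ensemble Kalman filter (EnKF) analysis step.
    ensemble: (n_state, n_members) forecast states (e.g. water levels, m)
    obs:      (n_obs,) observation vector (satellite- or crowd-derived levels)
    H:        (n_obs, n_state) linear observation operator."""
    n_state, n_members = ensemble.shape
    X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HX = H @ ensemble
    HA = HX - HX.mean(axis=1, keepdims=True)               # observed-space anomalies
    P_hh = HA @ HA.T / (n_members - 1) + np.eye(len(obs)) * obs_error_std**2
    P_xh = X @ HA.T / (n_members - 1)
    K = P_xh @ np.linalg.inv(P_hh)                          # Kalman gain
    # Perturb the observation for each member (stochastic EnKF variant).
    obs_perturbed = obs[:, None] + rng.normal(0, obs_error_std, (len(obs), n_members))
    return ensemble + K @ (obs_perturbed - HX)

# Hypothetical example: 3 floodplain water levels, a 20-member ensemble,
# and one crowd-sourced observation of the second state variable.
forecast = 10.0 + rng.normal(0, 0.5, (3, 20))
H = np.array([[0.0, 1.0, 0.0]])
observation = np.array([10.6])

analysis = enkf_update(forecast, observation, obs_error_std=0.1, H=H)
print("Forecast mean:", forecast.mean(axis=1).round(2))
print("Analysis mean:", analysis.mean(axis=1).round(2))
```

In a full framework, the observation operator would relate the modelled state to satellite-observed flood extents or multiple crowd-sourced reports rather than a single point measurement, but the update logic stays the same.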

We use a combination of satellite data and crowd-sourcing to improve model forecast skill, which could mean the difference between life and death for people living in low-lying, flood-prone areas. Studies integrating remotely sensed flood depths with models have reported an increase in accuracy for forecasts made more than 10 days in advance, which could positively impact disaster preparedness. In addition to improving the forecasts, we have also worked on improving the flood mapping process from satellite data, where our approach reduced the errors by more than half in some regions! We introduced a novel fuzzy flood mapping technique, which doesn’t require any ground data, can be easily automated, and provides the desired level of confidence in the flood map at each pixel.
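To give a flavour of what per-pixel confidence means, the generic sketch below assigns each SAR backscatter pixel a fuzzy membership to the ‘flooded’ class using a simple Z-shaped membership function. The thresholds and backscatter values are hypothetical; this is only an illustration of fuzzy classification, not our published mapping algorithm.

```python
import numpy as np

def fuzzy_flood_membership(backscatter_db, low_db, high_db):
    """Z-shaped fuzzy membership to the 'flooded' class from SAR backscatter.
    Pixels at or below low_db get membership 1 (open water is a smooth,
    low-backscatter target); pixels at or above high_db get 0; values in
    between decrease linearly. Returns per-pixel confidence in [0, 1]."""
    mu = (high_db - backscatter_db) / (high_db - low_db)
    return np.clip(mu, 0.0, 1.0)

# Hypothetical Sentinel-1-like VV backscatter values (dB), for illustration.
sigma0 = np.array([-22.0, -19.5, -17.0, -14.0, -11.0, -8.5])
membership = fuzzy_flood_membership(sigma0, low_db=-20.0, high_db=-13.0)

confidence_level = 0.8  # keep only pixels mapped with at least 80% confidence
flood_map = membership >= confidence_level
print(np.round(membership, 2), flood_map)
```

Because the output is a confidence value rather than a hard yes/no label, end-users can pick the threshold that matches their own tolerance for false alarms versus missed detections.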

Says Prof Murali Sastry, CEO of the IITB-Monash Research Academy, “In a world where climate change and population growth have already exposed over 2.8 billion people to flooding between 1995 and 2015, it’s baffling that we don’t have more people using the power of data for the greater good. Science is only as useful as the people it benefits, and the scientific community should work towards making research actionable for end-users. The work by researchers like Antara Dasgupta can go a long way in saving millions of lives. We wish her all success.”

Research scholar: Antara Dasgupta, IITB-Monash Research Academy

Project title: Towards a Comprehensive Data Assimilation Framework for Operational Hydrodynamic Flood Forecasting

Research supervisors: Dr. RAAJ Ramsankaran and Prof. Jeffrey P. Walker

Contact details: antara.dasgupta@monash.edu

This story was written by Antara Dasgupta. Copyright IITB-Monash Research Academy