Flood Extent Mapping: An Integrated Method Using Deep Learning and Region Growing Using UAV Optical Data

Bibliographic Details
Main Authors: Leila Hashemi-Beni, Asmamaw A. Gebrehiwot
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9324919/
Description
Summary: Flooding occurs frequently and causes loss of life and extensive damage to infrastructure and the environment. Accurate and timely mapping of flood extent to ascertain damage is critical for relief activities. Recently, deep-learning-based approaches, including convolutional neural networks (CNNs), have shown promising results for flood extent mapping. However, these methods cannot extract floods underneath the vegetation canopy from optical imagery. This article addresses that problem by introducing an integrated CNN and region growing (RG) method for mapping both visible and vegetation-covered flooded areas. The CNN-based classifier extracts flooded areas from the optical images, whereas the RG method uses a digital elevation model to estimate the extent of floods underneath vegetation that are not visible in the imagery. A data augmentation technique is applied when training the CNN-based classifier to improve the classification results. The results show that data augmentation enhances the accuracy of image classification and that the proposed integrated method efficiently detects floods both in visible areas and in areas covered by vegetation, which is essential for supporting effective flood emergency response and recovery activities.
ISSN:2151-1535
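
The record contains no code, but the region-growing (RG) step described in the summary can be illustrated with a minimal Python sketch. It assumes the CNN-classified flood pixels act as seeds and that water is locally level, so the grower expands into neighbouring vegetation-masked cells whose DEM elevation lies at or below the seed elevation plus a small tolerance. The function name, the `height_tolerance` parameter, and the use of each seed's own DEM value as its water level are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from collections import deque

def grow_flood_under_vegetation(dem, flood_mask, vegetation_mask, height_tolerance=0.1):
    """Region-growing sketch: extend CNN-detected flood pixels into adjacent
    vegetated cells whose DEM elevation does not exceed the local water level.

    dem              -- 2-D array of ground elevations (metres)
    flood_mask       -- boolean array, True where the CNN classified open water
    vegetation_mask  -- boolean array, True where vegetation hides the surface
    height_tolerance -- assumed slack (metres) allowed above the seed water level
    """
    rows, cols = dem.shape
    grown = flood_mask.copy()
    # Seed the queue with every visible flood pixel; its DEM value stands in
    # for the local water level (an assumption for this sketch).
    queue = deque((r, c, dem[r, c]) for r, c in zip(*np.nonzero(flood_mask)))
    while queue:
        r, c, water_level = queue.popleft()
        # Visit the 4-connected neighbours of the current cell.
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue
            if grown[nr, nc] or not vegetation_mask[nr, nc]:
                continue
            # Grow only into vegetated cells at or below the water level.
            if dem[nr, nc] <= water_level + height_tolerance:
                grown[nr, nc] = True
                queue.append((nr, nc, water_level))
    return grown
```

The breadth-first queue keeps the growth connected to the visible flood boundary, which mirrors the idea in the summary that the DEM, not the optical image, constrains how far the flood can plausibly extend beneath the canopy.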