Multi-Scale Bushfire Detection From Multi-Modal Streams of Remote Sensing Data


Bibliographic Details
Main Authors: Thanh Cong Phan, Thanh Tam Nguyen, Thanh Dat Hoang, Quoc Viet Hung Nguyen, Jun Jo
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9303404/
Description
Summary: Bushfires are a destructive force that can change the course of a country and even the Earth. They cause casualties and affect the quality of life of millions of people. Governments are calling for remote sensing methods to monitor and detect active bushfires around the clock. To answer this call, we develop a remote sensing framework on top of satellite imagery streams that monitors and detects bushfires in a timely manner, before they grow beyond control. However, detecting bushfires from satellite images needs to take into account several aspects, including the spatial pattern of fire-positive pixels, temporal dependencies, spectral correlation between channels, and adversarial effects. In this article, we propose a multi-scale deep neural network model that combines satellite images and weather data to detect and locate bushfires at both the image and pixel levels. We illustrate that weather information, with careful spatio-temporal alignment, can be utilised to augment the imagery data. Experiments on real-world datasets show that the proposed model outperforms the baselines, reaching 93.4% accuracy and detecting bushfires 1.2 times faster. It is also robust to the effects of cloud and night-time.
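The following is a minimal illustrative sketch (not the authors' code) of the general idea described in the abstract: broadcast spatio-temporally aligned weather features across the satellite image, extract features at multiple scales, and predict fire both at the image level and at the pixel level. The channel counts, layer sizes, and weather-broadcasting scheme are assumptions made for illustration only.

# Sketch of a multi-modal, multi-scale fire detector in PyTorch.
# All architectural details (band count, weather variables, layer widths)
# are illustrative assumptions, not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleFireNet(nn.Module):
    def __init__(self, img_channels=6, weather_features=4):
        super().__init__()
        # Weather features (e.g. temperature, humidity, wind) aligned to the
        # image footprint are broadcast to every pixel and concatenated as
        # extra input channels.
        in_ch = img_channels + weather_features
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
        self.enc3 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        # Pixel-level head: fuse the multi-scale features at full resolution
        # and predict a per-pixel fire logit.
        self.pixel_head = nn.Conv2d(32 + 64 + 128, 1, 1)
        # Image-level head: global pooling over the coarsest scale, then a
        # binary fire / no-fire classifier.
        self.image_head = nn.Linear(128, 1)

    def forward(self, image, weather):
        # image:   (B, C, H, W) satellite bands
        # weather: (B, F) weather vector aligned to the same time and region
        b, _, h, w = image.shape
        weather_map = weather[:, :, None, None].expand(b, weather.size(1), h, w)
        x = torch.cat([image, weather_map], dim=1)

        f1 = self.enc1(x)   # full resolution
        f2 = self.enc2(f1)  # 1/2 resolution
        f3 = self.enc3(f2)  # 1/4 resolution

        # Upsample coarser scales and fuse for per-pixel prediction.
        up2 = F.interpolate(f2, size=(h, w), mode="bilinear", align_corners=False)
        up3 = F.interpolate(f3, size=(h, w), mode="bilinear", align_corners=False)
        pixel_logits = self.pixel_head(torch.cat([f1, up2, up3], dim=1))

        # Image-level prediction from globally pooled deep features.
        image_logits = self.image_head(f3.mean(dim=(2, 3)))
        return image_logits, pixel_logits


if __name__ == "__main__":
    model = MultiScaleFireNet()
    img = torch.randn(2, 6, 128, 128)   # e.g. 6 spectral bands
    wx = torch.randn(2, 4)              # e.g. 4 weather variables
    img_out, pix_out = model(img, wx)
    print(img_out.shape, pix_out.shape) # (2, 1) and (2, 1, 128, 128)

The two output heads mirror the abstract's image-level and pixel-level detection; how the actual model aligns weather observations in space and time, and how it handles cloud and night-time effects, is not specified here and would follow the paper itself.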
ISSN:2169-3536