Summary: | Bushfires are a destructive force that can change the course of a country and even the planet. They cause casualties and affect the quality of life of millions of people. Governments are calling for remote sensing methods to monitor and detect active bushfires around the clock. To answer this call, we develop a remote sensing framework on top of satellite imagery streams to monitor and detect bushfires in a timely manner, before they grow beyond control. However, detecting bushfires from satellite images must take into account several aspects, including the spatial pattern of fire-positive pixels, temporal dependencies, spectral correlations between channels, and adverse effects such as cloud cover and night-time conditions. In this article, we propose a multi-scale deep neural network model that combines satellite imagery and weather data to detect and locate bushfires at both the image and pixel levels. We illustrate that weather information, with careful spatio-temporal alignment, can be utilised to augment the imagery data. Experiments on real-world datasets show that the proposed model outperforms the baselines, achieving 93.4% accuracy and detecting bushfires 1.2 times faster. It is also robust to the effects of cloud cover and night-time.
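The core idea described in the summary, fusing multi-scale image features with spatio-temporally aligned weather variables to produce both image-level and pixel-level detections, can be sketched as follows. This is a minimal illustration assuming a PyTorch implementation; the module names (e.g. MultiScaleFireNet, weather_mlp), layer sizes, and the number of spectral bands and weather variables are hypothetical and not taken from the article.

```python
# Minimal sketch (not the authors' implementation) of fusing multi-scale
# image features with an aligned weather vector, producing both a pixel-level
# fire mask and an image-level fire probability. All names and sizes are
# illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleFireNet(nn.Module):
    def __init__(self, in_channels: int = 6, weather_dim: int = 4):
        super().__init__()
        # Two convolutional scales over the multispectral image patch.
        self.scale1 = nn.Sequential(nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU())
        self.scale2 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        # Embed the spatio-temporally aligned weather variables.
        self.weather_mlp = nn.Sequential(nn.Linear(weather_dim, 32), nn.ReLU())
        # Heads: per-pixel fire logits and an image-level fire logit.
        self.pixel_head = nn.Conv2d(16 + 32 + 32, 1, kernel_size=1)
        self.image_head = nn.Linear(16 + 32 + 32, 1)

    def forward(self, image: torch.Tensor, weather: torch.Tensor):
        f1 = self.scale1(image)                       # (B, 16, H, W)
        f2 = self.scale2(f1)                          # (B, 32, H/2, W/2)
        f2_up = F.interpolate(f2, size=f1.shape[-2:], mode="bilinear",
                              align_corners=False)    # upsample coarse scale
        w = self.weather_mlp(weather)                 # (B, 32)
        w_map = w[:, :, None, None].expand(-1, -1, *f1.shape[-2:])
        fused = torch.cat([f1, f2_up, w_map], dim=1)  # multi-scale + weather fusion
        pixel_logits = self.pixel_head(fused)                       # pixel-level output
        image_logits = self.image_head(fused.mean(dim=(-2, -1)))    # image-level output
        return pixel_logits, image_logits


# Toy usage: a batch of 6-band image patches with 4 aligned weather variables.
model = MultiScaleFireNet()
pixel_logits, image_logits = model(torch.randn(2, 6, 64, 64), torch.randn(2, 4))
print(pixel_logits.shape, image_logits.shape)  # (2, 1, 64, 64) and (2, 1)
```

In this sketch the weather embedding is broadcast over the spatial grid and concatenated with image features from two scales; the article's actual fusion and alignment scheme may differ.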