A MANY-TO-MANY FULLY CONVOLUTIONAL RECURRENT NETWORK FOR MULTITEMPORAL CROP RECOGNITION
Main Authors:
Format: Article
Language: English
Published: Copernicus Publications, 2019-09-01
Series: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access: https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/IV-2-W7/25/2019/isprs-annals-IV-2-W7-25-2019.pdf
Summary: Recently, recurrent neural networks have been proposed for crop mapping from multitemporal remote sensing data. Most of these proposals have been designed and tested in temperate regions, where a single harvest per season is the rule. In tropical regions, the favorable climate and local agricultural practices, such as crop rotation, result in more complex spatio-temporal dynamics, where the single-harvest-per-season assumption does not hold. In this context, a demand arises for methods capable of recognizing agricultural crops at multiple dates along the multitemporal sequence. In the present work, we propose to adapt two recurrent neural networks, originally conceived for a single harvest per season, for multidate crop recognition. In addition, we propose a novel multidate approach based on bidirectional fully convolutional recurrent neural networks. These three architectures were evaluated on public Sentinel-1 data sets from two tropical regions in Brazil. In our experiments, all methods achieved state-of-the-art accuracies, with a clear superiority of the proposed architecture. It outperformed its counterparts by up to 3.8% and 7.4% in terms of per-month overall accuracy, and it was the best-performing method in terms of F1-score for most crops and dates in both regions.
ISSN: 2194-9042, 2194-9050
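The record contains no code, but the summary describes the core idea: a bidirectional, fully convolutional recurrent network that emits one crop map per date of the sequence (many-to-many). The sketch below illustrates that idea in Keras; it is not the authors' implementation, and the layer widths, sequence length, band count, and number of classes are illustrative assumptions.

```python
# Minimal sketch of a many-to-many bidirectional fully convolutional recurrent
# network for per-date, per-pixel crop classification. All sizes are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

T, H, W, C = 12, 64, 64, 2   # assumed: 12 Sentinel-1 dates, 64x64 patches, VV/VH bands
NUM_CLASSES = 10             # assumed number of crop classes

inputs = tf.keras.Input(shape=(T, H, W, C))

# Per-date convolutional feature extraction, shared across all dates.
x = layers.TimeDistributed(
    layers.Conv2D(32, 3, padding="same", activation="relu")
)(inputs)

# Bidirectional convolutional recurrence: each date sees past and future context.
x = layers.Bidirectional(
    layers.ConvLSTM2D(32, 3, padding="same", return_sequences=True),
    merge_mode="concat",
)(x)

# Many-to-many head: a 1x1 convolutional classifier applied at every date and pixel.
outputs = layers.TimeDistributed(
    layers.Conv2D(NUM_CLASSES, 1, activation="softmax")
)(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()
```

Setting `return_sequences=True` on the recurrent layer is what makes the network many-to-many: it yields one label map per input date rather than a single map for the whole sequence, which matches the multidate recognition setting described in the summary.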