Application and Evaluation of a Deep Learning Architecture to Urban Tree Canopy Mapping

Bibliographic Details
Main Authors: Zhe Wang, Chao Fan, Min Xian
Format: Article
Language: English
Published: MDPI AG 2021-04-01
Series: Remote Sensing
Online Access: https://www.mdpi.com/2072-4292/13/9/1749
Description
Summary: The urban forest is a dynamic urban ecosystem that provides critical benefits to urban residents and the environment. Accurate mapping of the urban forest plays an important role in greenspace management. In this study, we apply a deep learning model, the U-net, to urban tree canopy mapping using high-resolution aerial photographs. We evaluate the feasibility and effectiveness of the U-net for tree canopy mapping through experiments at four spatial scales: 16 cm, 32 cm, 50 cm, and 100 cm. The overall performance of all approaches is validated on the ISPRS Vaihingen 2D Semantic Labeling dataset using four quantitative metrics: Dice, Intersection over Union, Overall Accuracy, and Kappa Coefficient. Two evaluations are performed to assess the model performance. Experimental results show that the U-net with 32-cm input images performs best, with an overall accuracy of 0.9914 and an Intersection over Union of 0.9638. The U-net achieves state-of-the-art overall performance in comparison with an object-based image analysis approach and other deep learning frameworks. The outstanding performance of the U-net indicates the possibility of applying it to urban tree segmentation at a wide range of spatial scales. The U-net accurately recognizes and delineates tree canopy across different land cover features and has great potential to be adopted as an effective tool for high-resolution land cover mapping.
ISSN: 2072-4292
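
The abstract reports four evaluation metrics for the segmentation results: Dice, Intersection over Union (IoU), Overall Accuracy, and Kappa Coefficient. The article record does not include code, so the following is a minimal illustrative Python/NumPy sketch of how these metrics can be computed from binary tree-canopy masks; the function name and implementation are assumptions for illustration, not the authors' code.

```python
import numpy as np

def canopy_metrics(pred: np.ndarray, truth: np.ndarray) -> dict:
    """Compute Dice, IoU, Overall Accuracy, and Cohen's Kappa for binary masks (1 = tree canopy)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)

    # Confusion-matrix counts
    tp = np.sum(pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    n = tp + tn + fp + fn

    dice = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
    iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
    oa = (tp + tn) / n

    # Cohen's kappa: observed agreement corrected for chance agreement
    p_e = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / (n * n)
    kappa = (oa - p_e) / (1 - p_e) if p_e < 1 else 1.0

    return {"Dice": dice, "IoU": iou, "OA": oa, "Kappa": kappa}

# Example usage with random masks (for illustration only)
rng = np.random.default_rng(0)
pred = rng.integers(0, 2, size=(256, 256))
truth = rng.integers(0, 2, size=(256, 256))
print(canopy_metrics(pred, truth))
```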