Summary: | The use of imagery from small unmanned aircraft systems (sUAS) has enabled the production of more accurate data about the effects of wildland fire, allowing land managers to make more informed decisions. The ability to detect trees in hyperspatial imagery enables the calculation of canopy cover. Comparing hyperspatial post-fire canopy cover with pre-fire canopy cover from sources such as the LANDFIRE project allows the calculation of tree mortality, a major indicator of burn severity. A mask region-based convolutional neural network was trained to classify trees as groups of pixels in a hyperspatial orthomosaic acquired with a small unmanned aircraft system. The tree classification is summarized at 30 m resolution, resulting in a canopy cover raster. The post-fire canopy cover is then compared with the LANDFIRE canopy cover preceding the fire to calculate how much the canopy was reduced by the fire. Canopy reduction allows the mapping of burn severity while also identifying where surface, passive crown, and active crown fire occurred within the burn perimeter. Canopy cover mapped through this effort was lower than the LANDFIRE Canopy Cover product, which the literature indicates is typically over-reported. Assessment of canopy reduction mapping on a wildland fire reflected observations made both from ground-truthing efforts and from the associated hyperspatial sUAS orthomosaic.