Summary: Here, we demonstrate how deep neural network (DNN) detections of multiple constituent or component objects that are part of a larger, more complex, encompassing feature can be spatially fused to improve the search, detection, and retrieval (ranking) of that larger complex feature. First, scores computed from a spatial clustering algorithm are normalized to a reference space so that they are independent of image resolution and DNN input chip size. Then, multiscale DNN detections of the various component objects are fused to improve the detection and retrieval of DNN detections of the larger complex feature. We demonstrate the utility of this approach for broad-area search and detection of surface-to-air missile (SAM) sites, which have a very low occurrence rate (only 16 sites) over a ~90 000 km² study area in SE China. The results demonstrate that spatial fusion of multiscale component-object DNN detections can reduce the detection error rate for SAM sites by >85% while maintaining 100% recall. The novel spatial fusion approach demonstrated here can be readily extended to a wide variety of other challenging object search and detection problems in large-scale remote sensing image datasets.
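The sketch below is a minimal illustration of the two steps described above, not the paper's exact pipeline: component-object detections from multiple scales are clustered spatially, cluster scores are normalized by an assumed area-ratio rule so they are comparable across chip footprints, and the normalized scores are summed (an assumed fusion rule) to rank candidate complex-feature locations. All function names, parameter values, and the use of DBSCAN as the spatial clustering algorithm are illustrative assumptions.

```python
"""Illustrative sketch: spatial fusion of multiscale component-object detections."""
import numpy as np
from sklearn.cluster import DBSCAN  # stand-in for the paper's spatial clustering step


def normalize_score(score: float, chip_area_m2: float, ref_area_m2: float = 1.0e6) -> float:
    """Scale a raw detection/cluster score by the ratio of a reference area to the
    detector's chip footprint, so scores from different image resolutions and
    chip sizes are comparable (assumed normalization rule)."""
    return score * (ref_area_m2 / chip_area_m2)


def fuse_detections(detections, eps_m: float = 500.0, min_samples: int = 2):
    """Fuse component-object detections into ranked candidate complex features.

    detections: iterable of (x_m, y_m, normalized_score) in a common projected CRS,
    pooled across component-object classes and detection scales.
    Returns a list of (x_m, y_m, fused_score) sorted by descending fused score.
    """
    xy = np.array([(x, y) for x, y, _ in detections])
    scores = np.array([s for _, _, s in detections])
    labels = DBSCAN(eps=eps_m, min_samples=min_samples).fit(xy).labels_

    fused = []
    for lab in set(labels) - {-1}:          # -1 marks DBSCAN noise points
        mask = labels == lab
        cx, cy = xy[mask].mean(axis=0)      # cluster centroid = candidate location
        fused.append((cx, cy, scores[mask].sum()))  # sum as the assumed fusion rule
    return sorted(fused, key=lambda c: c[2], reverse=True)


if __name__ == "__main__":
    # Toy example: two component detections near one candidate site, one isolated detection.
    dets = [
        (1000.0, 1000.0, normalize_score(0.8, chip_area_m2=2.5e5)),
        (1200.0, 950.0, normalize_score(0.6, chip_area_m2=1.0e6)),
        (9000.0, 9000.0, normalize_score(0.7, chip_area_m2=1.0e6)),
    ]
    for cx, cy, s in fuse_detections(dets):
        print(f"candidate at ({cx:.0f}, {cy:.0f}) m, fused score {s:.2f}")
```

In this toy setup, the two nearby detections form one cluster whose fused score outranks the isolated detection, which mirrors the intended effect of ranking candidate sites higher when multiple component objects co-occur spatially.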