An Innovative Curvelet-only-Based Approach for Automated Change Detection in Multi-Temporal SAR Imagery


Bibliographic Details
Main Authors: Andreas Schmitt, Birgit Wessel, Achim Roth
Format: Article
Language: English
Published: MDPI AG 2014-03-01
Series: Remote Sensing
Online Access: http://www.mdpi.com/2072-4292/6/3/2435
Description
Summary: This paper presents a novel approach for automated image comparison and robust change detection from noisy imagery, such as synthetic aperture radar (SAR) amplitude images. Instead of comparing pixel values and/or pre-classified features, this approach clearly highlights structural changes without any preceding segmentation or classification step. The crucial point is the use of the Curvelet transform to express the image as a composition of several structures rather than as numerous individual pixels. Differencing these structures and weighting their impact according to the image statistics produces a smooth but detail-preserving change image. The Curvelet-based approach is validated against the standard technique for SAR change detection, the log-ratio, with and without additional gamma maximum-a-posteriori (GMAP) speckle filtering, and against the results of human interpreters. The validation proves that the new technique can easily compete with these automated as well as visual interpretation techniques. Finally, a sequence of TerraSAR-X High Resolution Spotlight images of a factory-building construction site near Ludwigshafen (Germany) is processed in order to identify single construction stages by the time of the (dis-)appearance of certain objects. Hence, complete construction monitoring of the whole building and its surroundings becomes feasible.
ISSN:2072-4292
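The abstract names the log-ratio as the standard baseline technique for SAR change detection. As a minimal illustrative sketch (not the paper's Curvelet-based method, and with image values and threshold invented for the example), the log-ratio between two co-registered SAR amplitude images can be computed like this:

```python
import numpy as np

def log_ratio_change(img1, img2, eps=1e-6):
    """Log-ratio change indicator for two co-registered SAR amplitude images.

    The ratio turns multiplicative speckle into an additive term in log space;
    large |log(img2 / img1)| marks pixels with strong amplitude change.
    eps guards against division by zero in dark (low-amplitude) regions.
    """
    return np.abs(np.log((img2 + eps) / (img1 + eps)))

# Hypothetical 4x4 amplitude images: one pixel brightens strongly,
# mimicking an appearing object (e.g. a new structure on a construction site).
before = np.full((4, 4), 10.0)
after = before.copy()
after[1, 2] = 40.0

lr = log_ratio_change(before, after)
changed = lr > 1.0  # threshold chosen purely for illustration
```

In practice the log-ratio image is speckled, which is why the paper compares against a GMAP-filtered variant; a structure-based transform such as the Curvelet decomposition suppresses that noise while keeping edges, which is the contribution the abstract describes.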