MULTI-TEMPORAL AND MULTI-SENSOR IMAGE MATCHING BASED ON LOCAL FREQUENCY INFORMATION

Image matching is often one of the first tasks in many photogrammetry and remote sensing applications. This paper presents an efficient approach to automated multi-temporal and multi-sensor image matching based on local frequency information. Two new independent image representations, the Local Average Phase (LAP) and the Local Weighted Amplitude (LWA), are presented to emphasize the common scene information while suppressing the non-common illumination- and sensor-dependent information. To obtain the two representations, local frequency information is first extracted with a Log-Gabor wavelet transform, whose response is similar to that of the human visual system; the outputs of the odd- and even-symmetric filters are then used to construct the LAP and LWA. The LAP emphasizes phase information, while the LWA emphasizes amplitude information. Because both representations are derivative-free and threshold-free, they are robust to noise and preserve as much image detail as possible. A new Compositional Similarity Measure (CSM) is also presented, which combines the LAP and LWA with equal weights to measure the similarity of multi-temporal and multi-sensor images. The CSM allows the LAP and LWA to compensate for each other and makes full use of both the amplitude and the phase of the local frequency information. In many image matching applications, the template is selected without regard to its matching robustness and accuracy. To overcome this problem, a local best matching point detection method is presented that uses self-similarity analysis to identify the template with the highest matching robustness and accuracy. Experimental results on real and simulated images demonstrate that the presented approach is effective for matching image pairs with significant scene and illumination changes, and that it has advantages over other state-of-the-art approaches, including Local Frequency Response Vectors (LFRV), Phase Congruency (PC), and the Four Directional-Derivative-Energy Image (FDDEI), especially at low signal-to-noise ratios (SNR). As few assumptions are made, the proposed method can foreseeably be used in a wide variety of image matching applications.
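The pipeline described in the abstract (Log-Gabor filtering, odd- and even-symmetric filter outputs, local phase and amplitude, and an equal-weight similarity combination) can be sketched as follows. This is an illustrative 1-D sketch, not the paper's implementation: the filter parameters `f0` and `sigma_ratio` and the simple phase/amplitude similarity terms are assumptions, and the paper's exact LAP, LWA, and CSM definitions are not reproduced here.

```python
import numpy as np

def log_gabor_responses(signal, f0=0.1, sigma_ratio=0.55):
    """Band-pass a 1-D signal with a Log-Gabor filter and return the
    even-symmetric and odd-symmetric (Hilbert-pair) responses."""
    n = len(signal)
    freqs = np.fft.fftfreq(n)            # signed normalized frequencies
    mag = np.abs(freqs)
    G = np.zeros(n)
    nz = mag > 0                         # Log-Gabor is zero at DC by construction
    G[nz] = np.exp(-(np.log(mag[nz] / f0) ** 2)
                   / (2 * np.log(sigma_ratio) ** 2))
    S = np.fft.fft(signal)
    even = np.real(np.fft.ifft(S * G))   # even-symmetric filter output
    # Odd-symmetric output: apply the Hilbert transfer function -i*sign(f)
    odd = np.real(np.fft.ifft(S * G * (-1j) * np.sign(freqs)))
    return even, odd

def phase_and_amplitude(even, odd):
    """Local phase and amplitude of the filter responses (illustrative
    stand-ins for the paper's LAP and LWA representations)."""
    amplitude = np.hypot(even, odd)      # local amplitude (energy envelope)
    phase = np.arctan2(odd, even)        # local phase in (-pi, pi]
    return phase, amplitude

def compositional_similarity(phase1, amp1, phase2, amp2):
    """Equal-weight combination of a phase term and an amplitude term,
    in the spirit of the CSM; the exact formula is an assumption here."""
    phase_sim = float(np.mean(np.cos(phase1 - phase2)))    # 1 when phases agree
    a1, a2 = amp1 - amp1.mean(), amp2 - amp2.mean()
    denom = np.linalg.norm(a1) * np.linalg.norm(a2)
    amp_sim = float(a1 @ a2 / denom) if denom > 0 else 0.0  # NCC of amplitudes
    return 0.5 * phase_sim + 0.5 * amp_sim
```

Because both the phase and the amplitude terms are computed from the raw filter responses, no derivatives or thresholds are involved, which is the property the abstract credits for the method's noise robustness.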

Bibliographic Details
Main Authors: X. Liu, Q. Yu, X. Zhang, Y. Shang, X. Zhu, Z. Lei
Format: Article
Language: English
Published: Copernicus Publications, 2012-08-01
Series: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access: https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXIX-B3/485/2012/isprsarchives-XXXIX-B3-485-2012.pdf
ISSN: 1682-1750, 2194-9034
DOI: 10.5194/isprsarchives-XXXIX-B3-485-2012
Author Affiliation: Aeronautical and Astronautical Science and Technology, National University of Defense Technology, Changsha, Hunan, China