A study of image recovery techniques for radio long baseline interferometry
This thesis is concerned with data processing techniques used to form images from long baseline interferometers (LBI). The basic problems to be overcome in the imaging process are to correct for the large phase errors introduced by the atmosphere and to allow for the sparse sampling of the aperture. The objectives of the study were to determine the relative importance of the constraints placed on the image and to look for improvements to the process. It is concluded that only a limited class of objects can be imaged. These objects must consist of separated features which remain separated in the zero-phase (or auto-correlation) image. Positivity and confinement were determined to be the most significant constraints for the imaging process. Closure-phase information is of secondary importance and is used to extend the dynamic range of the image once the main features are established. The main agent for enforcing the confinement constraint is the "CLEAN" algorithm. In the recovery of objects with extended features, the imaging process fails due to the well-known inability of CLEAN to correctly restore these features. A new CLEAN algorithm has been developed which is stable for extended objects. This algorithm will be of interest to many synthesis telescope users as it not only yields better results but is also significantly faster and easier to use than earlier algorithms. Other experiments and analysis have shown that imaging processes which operate by maximizing a global image parameter such as sharpness or entropy cannot be used for imaging with large phase errors. This class of algorithms will always be confused by the zero-phase image. It was possible, however, to combine the global technique of maximum entropy with the additional constraints of confinement and closure-phase to form a working algorithm.
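For readers unfamiliar with the techniques named in the abstract, two brief notes may help. The closure phase referred to above is the sum of visibility phases around a triangle of antennas, $\phi_{12} + \phi_{23} + \phi_{31}$; the antenna-based atmospheric phase errors cancel in this sum, which is why it survives the large phase errors that corrupt the individual baseline phases. The sketch below shows the classic Högbom-style CLEAN loop that the thesis takes as its starting point. It is an illustrative reconstruction only, not the improved CLEAN algorithm developed in the thesis; the function name, parameter values, and the assumption that the dirty beam array is twice the image size on each axis are choices made for this sketch.

```python
import numpy as np

def hogbom_clean(dirty_image, dirty_beam, gain=0.1, threshold=0.01, max_iter=500):
    """Illustrative Hogbom-style CLEAN loop (sketch only).

    dirty_image : 2-D float array, the Fourier inversion of the sampled visibilities.
    dirty_beam  : 2-D float array, the point-spread function of the aperture
                  sampling, assumed here to be 2*ny by 2*nx so it can be
                  centred on any image pixel.
    Returns the point-source model and the final residual image.
    """
    residual = dirty_image.copy()
    model = np.zeros_like(dirty_image)
    ny, nx = residual.shape
    cy, cx = dirty_beam.shape[0] // 2, dirty_beam.shape[1] // 2

    for _ in range(max_iter):
        # Locate the brightest remaining residual pixel.
        py, px = np.unravel_index(np.argmax(residual), residual.shape)
        peak = residual[py, px]
        if peak < threshold:
            break
        # Record a fraction of the peak as a point-source component;
        # the model is built only from bright, localized residual peaks.
        model[py, px] += gain * peak
        # Subtract the scaled dirty beam centred on that pixel.
        residual -= gain * peak * dirty_beam[cy - py:cy - py + ny,
                                             cx - px:cx - px + nx]

    return model, residual
```

Each iteration transfers a fraction of the brightest residual peak into a point-source model and subtracts the correspondingly scaled dirty beam, which is how CLEAN deconvolves the sidelobes produced by the sparse sampling of the aperture.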
Main Author: | Steer, David G. |
---|---|
Language: | English |
Published: | University of British Columbia, 2010 |
Online Access: | http://hdl.handle.net/2429/25637 |
id | ndltd-UBC-oai-circle.library.ubc.ca-2429-25637 |
---|---|
record_format | oai_dc |
degree | Graduate |
faculty | Applied Science |
department | Electrical and Computer Engineering |
date_issued | 1983 |
date_available | 2010-06-13T16:23:44Z |
type | Text, Thesis/Dissertation |
rights | For non-commercial purposes only, such as research, private study and education. Additional conditions apply, see Terms of Use https://open.library.ubc.ca/terms_of_use. |
collection | NDLTD |
language | English |
sources | NDLTD |
author | Steer, David G. |
title | A study of image recovery techniques for radio long baseline interferometry |
publisher | University of British Columbia |
publishDate | 2010 |
url | http://hdl.handle.net/2429/25637 |