Depth Measurement Based on Pixel Number Variation and Speeded Up Robust Features (SURF)

Master's thesis === National Taiwan Normal University (國立臺灣師範大學) === Department of Applied Electronics Technology (應用電子科技學系) === Academic year 100 (2011) === This paper presents a method for depth measurement based on Speeded Up Robust Features (SURF) and the pixel number variation of CCD images. A single camera captures two images at different photographing distances, and SURF features are extracted from and matched between the two images. To remove mismatches from the putative point correspondences, the Identifying point correspondences by Correspondence Function (ICF) method is adopted, which automatically selects better reference points for the pixel number variation method. Based on the known displacement of the camera between the two photographing distances, the matched feature points of the objects are used to measure the distances of the target objects. The distance information obtained at these feature points is then used to construct a depth map by smooth interpolation.

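The abstract above outlines a concrete pipeline: SURF extraction and matching, mismatch removal, depth from pixel number variation, and depth-map interpolation. As a minimal illustration of the front end, the Python sketch below extracts and matches SURF features between the two captures taken at different photographing distances. It assumes an OpenCV build that includes the contrib xfeatures2d (non-free) module; the Lowe ratio test and RANSAC pruning are generic stand-ins for the thesis's ICF mismatch-removal step, and the function name, paths, and thresholds are illustrative.

import cv2
import numpy as np

def match_surf(path_near, path_far, hessian_threshold=400):
    """Extract SURF features in both captures and return matched point
    coordinates as two Nx2 arrays (near image, far image)."""
    img_near = cv2.imread(path_near, cv2.IMREAD_GRAYSCALE)
    img_far = cv2.imread(path_far, cv2.IMREAD_GRAYSCALE)

    # SURF lives in the opencv-contrib "nonfree" module; cv2.ORB_create()
    # (matched with cv2.NORM_HAMMING) is a patent-free substitute.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    kp_near, des_near = surf.detectAndCompute(img_near, None)
    kp_far, des_far = surf.detectAndCompute(img_far, None)

    # Nearest-neighbour matching with Lowe's ratio test; this and the RANSAC
    # pruning below stand in for the ICF mismatch removal used in the thesis.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des_near, des_far, k=2):
        if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
            good.append(pair[0])

    pts_near = np.float32([kp_near[m.queryIdx].pt for m in good])
    pts_far = np.float32([kp_far[m.trainIdx].pt for m in good])

    # Prune remaining outliers with a RANSAC-fitted homography.
    if len(good) >= 4:
        _, mask = cv2.findHomography(pts_near, pts_far, cv2.RANSAC, 3.0)
        if mask is not None:
            keep = mask.ravel().astype(bool)
            pts_near, pts_far = pts_near[keep], pts_far[keep]
    return pts_near, pts_far

For the depth-recovery step, a simple pinhole-camera argument (a sketch that may differ in detail from the thesis's own derivation) says the pixel extent n spanned by a feature is inversely proportional to its distance D. For two captures separated by a known camera displacement d, n_near * D = n_far * (D + d), which gives D = n_far * d / (n_near - n_far). The helpers below, again with illustrative names, apply this per matched feature and build a dense depth map by smooth interpolation with SciPy; n_near and n_far could be, for example, the pixel distance between a pair of matched reference points measured in each capture.

import numpy as np
from scipy.interpolate import griddata

def depth_from_pixel_variation(n_near, n_far, displacement):
    """Depth of each feature from its pixel extents in the near and far
    captures. Pinhole assumption: n * D is constant, so
    n_near * D = n_far * (D + d)  =>  D = n_far * d / (n_near - n_far).
    The result is in the same unit as `displacement`."""
    n_near = np.asarray(n_near, dtype=float)
    n_far = np.asarray(n_far, dtype=float)
    return n_far * displacement / (n_near - n_far)

def interpolate_depth_map(feature_xy, feature_depth, image_shape):
    """Spread the sparse per-feature depths over the whole image grid."""
    h, w = image_shape
    grid_y, grid_x = np.mgrid[0:h, 0:w]
    smooth = griddata(feature_xy, feature_depth, (grid_x, grid_y), method='cubic')
    # Cubic interpolation is undefined outside the features' convex hull;
    # fall back to nearest-neighbour values there.
    nearest = griddata(feature_xy, feature_depth, (grid_x, grid_y), method='nearest')
    return np.where(np.isnan(smooth), nearest, smooth)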

Bibliographic Details
Main Author: Zong-Han Cai (蔡宗翰)
Other Authors: Chen-Chien Hsu (許陳鑑)
Format: Others
Language: zh-TW
Published: 2011
Online Access: http://ndltd.ncl.edu.tw/handle/26750087644131777790
id ndltd-TW-100NTNU5427019
record_format oai_dc
title (Chinese) 結合像素差異法與SURF之景深量測系統
thesis 學位論文 (degree thesis); 54 pages
collection NDLTD
language zh-TW
format Others
sources NDLTD
author2 Chen-Chien Hsu
author Zong-Han Cai (蔡宗翰)
title Depth Measurement Based on Pixel Number Variation and Speeded Up Robust Features (SURF)
publishDate 2011
url http://ndltd.ncl.edu.tw/handle/26750087644131777790