Summary: | Master's thesis === National Chiao Tung University === Department of Civil Engineering === 105 === A panorama image is a 360-degree image covering the full horizontal direction. Panorama images can be generated by (1) a single camera taking images non-synchronously, or (2) multiple cameras taking images synchronously. Because a single camera is limited by the non-synchronous acquisition problem, this study uses five GoPro Hero4 cameras and one Nikon KeyMission 360 camera with dual fisheye lenses for synchronous image acquisition. Since a slight time lag remains even in nominally synchronous acquisition, every camera is time-synchronized before the panorama images are stitched. The purpose of this study is to capture images in video mode with a multi-camera rig and with a single dual-lens camera, stitch the individual images into panorama images under different projection modes, and finally recover the image orientations by combining multi-station panorama images with ground control points, so that 3D point clouds can be generated with image matching techniques.
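The abstract does not state how the residual time lag between cameras is estimated; a common approach, sketched here purely as an assumption, is to cross-correlate the cameras' audio tracks and shift the video frames by the resulting offset. The function name estimate_lag and the NumPy-based implementation are illustrative, not taken from the thesis.

```python
import numpy as np

def estimate_lag(audio_a, audio_b, sample_rate):
    """Estimate the time offset (in seconds) between two cameras' audio
    tracks from the peak of their cross-correlation.

    A positive value means audio_a lags behind audio_b.
    """
    # Normalise both tracks so loudness differences do not bias the peak.
    a = (audio_a - audio_a.mean()) / (audio_a.std() + 1e-12)
    b = (audio_b - audio_b.mean()) / (audio_b.std() + 1e-12)

    corr = np.correlate(a, b, mode="full")
    # 'full' mode covers lags from -(len(b)-1) to +(len(a)-1);
    # index len(b)-1 corresponds to zero lag.
    lag_samples = int(np.argmax(corr)) - (len(b) - 1)
    return lag_samples / sample_rate
```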
The methodology comprises four major parts: (1) panorama image generation, (2) image orientation recovery, (3) 3D point cloud generation by dense matching, and (4) Building Information Modeling (BIM) construction. First, panorama image generation extracts tie-points from every overlapping image pair and uses them to stitch the images into panorama images. Second, image orientation recovery applies the Structure from Motion (SfM) algorithm. Third, 3D point cloud generation densely matches the tie-points in image space and then computes the 3D point coordinates with the collinearity condition equations. Finally, BIM construction builds the model from the generated 3D point clouds.
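The thesis abstract does not name the feature detector used for tie-point extraction in part (1); the sketch below assumes SIFT features with Lowe's ratio test via OpenCV, and the function name extract_tie_points is illustrative.

```python
import cv2

def extract_tie_points(img_path_a, img_path_b, ratio=0.75):
    """Find candidate tie-points between two overlapping frames
    using SIFT features and Lowe's ratio test."""
    img_a = cv2.imread(img_path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(img_path_b, cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp_a, desc_a = sift.detectAndCompute(img_a, None)
    kp_b, desc_b = sift.detectAndCompute(img_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(desc_a, desc_b, k=2)

    # Keep only matches clearly better than their runner-up candidate.
    return [
        (kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt)
        for m, n in matches
        if m.distance < ratio * n.distance
    ]
```

For part (3), the collinearity condition in its standard photogrammetric form (the symbols follow common textbook notation, not necessarily the thesis' own) is

$$
x - x_0 = -f\,\frac{m_{11}(X - X_L) + m_{12}(Y - Y_L) + m_{13}(Z - Z_L)}{m_{31}(X - X_L) + m_{32}(Y - Y_L) + m_{33}(Z - Z_L)},
\qquad
y - y_0 = -f\,\frac{m_{21}(X - X_L) + m_{22}(Y - Y_L) + m_{23}(Z - Z_L)}{m_{31}(X - X_L) + m_{32}(Y - Y_L) + m_{33}(Z - Z_L)}
$$

where $(x, y)$ are the image coordinates of a matched point, $(x_0, y_0, f)$ the interior orientation parameters, $(X_L, Y_L, Z_L)$ the camera perspective centre, $m_{ij}$ the elements of the rotation matrix, and $(X, Y, Z)$ the object-space coordinates being solved for.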
The experimental analysis includes five parts: (1) 3D point clouds from the five GoPro Hero4 cameras with different numbers of stations and different projection modes, (2) 3D point clouds from the Nikon KeyMission 360 camera with dual fisheye lenses, (3) comparison of 3D point cloud accuracy, (4) analysis of the BIM construction, and (5) co-registration of the image-based 3D point clouds with the FARO terrestrial LiDAR point clouds. The experiments show that, comparing the accuracy of the 3D point clouds from the five GoPro Hero4 cameras and the Nikon KeyMission 360 camera, the relative error between a length measured in the 3D point cloud and the actual line is less than 3%. Moreover, the relative errors of the point-cloud-based BIM model in length, width, and height are all less than 1.01%.
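A minimal sketch of how such a relative-error figure can be computed, assuming a check distance between two points picked from the point cloud and a tape-measured reference length; the coordinates and the 2.000 m reference below are made-up illustration values, not thesis data.

```python
import numpy as np

def relative_error(p1, p2, reference_length):
    """Relative error of a distance measured in the point cloud
    against the corresponding reference (tape-measured) length."""
    measured = np.linalg.norm(np.asarray(p1) - np.asarray(p2))
    return abs(measured - reference_length) / reference_length

# Hypothetical check-line: two cloud points and a 2.000 m reference length.
err = relative_error((1.02, 0.31, 1.45), (2.98, 0.29, 1.41), 2.000)
print(f"relative error: {err:.2%}")  # about 1.97% for these illustrative values
```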
|