Development of Global Vision Positioning System for Pseudo-rigid Formation Control
Master's thesis === National Taiwan University === Institute of Applied Mechanics === academic year 104

Main Authors: Jer-Wen Huang (黃哲文)
Other Authors: Li-Sheng Wang (王立昇)
Format: Others
Language: zh-TW
Published: 2016
Online Access: http://ndltd.ncl.edu.tw/handle/34283458265172647056
id: ndltd-TW-104NTU05499021
record_format: oai_dc
spelling: ndltd-TW-104NTU05499021 2017-04-29T04:31:56Z http://ndltd.ncl.edu.tw/handle/34283458265172647056 Development of Global Vision Positioning System for Pseudo-rigid Formation Control 發展用於擬剛體編隊控制之全域視覺定位系統 Jer-Wen Huang 黃哲文. Master's thesis, National Taiwan University, Institute of Applied Mechanics, academic year 104. Advisor: Li-Sheng Wang 王立昇. 2016. Thesis, 70 pages, zh-TW.
collection: NDLTD
language: zh-TW
format: Others
sources: NDLTD
description:
Master's thesis === National Taiwan University === Institute of Applied Mechanics === 104 === The purpose of this thesis is to develop a global vision system for the pseudo-rigid formation control of a multi-vehicle system. An IP camera installed on the roof (three meters high) was used to provide the position coordinates and attitude angles of the vehicles. Through camera calibration and a vision positioning algorithm, this vision system is used to gather prior pose information for each vehicle, and the experiments are conducted by integrating hardware and software.
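The abstract does not spell out the positioning algorithm, so the following Python sketch shows one common way such an overhead system can be built, under stated assumptions: the calibrated camera views a planar floor, a homography estimated from four measured reference points (OpenCV's findHomography) maps pixels to floor coordinates, and each vehicle carries a front and a rear marker from which position and heading are derived. The reference-point values, marker layout, and function names are hypothetical.

```python
import cv2
import numpy as np

# Four reference points measured on the floor (metres) and their pixel
# locations in the overhead image; the values here are placeholders.
floor_pts = np.float32([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0], [0.0, 3.0]])
pixel_pts = np.float32([[112, 645], [1180, 660], [1150, 75], [130, 60]])

# Homography from image pixels to floor coordinates (assumes a planar floor).
H, _ = cv2.findHomography(pixel_pts, floor_pts)

def pixel_to_floor(pt_px):
    """Map one pixel coordinate to floor coordinates via the homography."""
    src = np.float32([[pt_px]])          # shape (1, 1, 2) as required
    dst = cv2.perspectiveTransform(src, H)
    return dst[0, 0]                     # (x, y) in metres

def vehicle_pose(front_px, rear_px):
    """Position = midpoint of the two markers; heading = direction rear -> front."""
    front = pixel_to_floor(front_px)
    rear = pixel_to_floor(rear_px)
    center = (front + rear) / 2.0
    heading = np.arctan2(front[1] - rear[1], front[0] - rear[0])
    return center, heading

# Example: marker pixel locations found by a (not shown) colour/blob detector.
center, heading = vehicle_pose((640, 360), (600, 360))
print(center, np.degrees(heading))
```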
The algorithm applied to design the vehicle formation is based on pseudo-rigid body theory, in which the formation is determined by a homogeneous deformation tensor, so that rotation, stretch, and shear are all allowed. Such a formation adapts to complex environments better than a rigid-body formation. The Rapidly-exploring Random Tree (RRT) method, together with route-adjustment techniques, was adopted to obtain the path of the formation center. The deformation matrix is then found by the method of virtual potential functions, from which the route of each vehicle is computed.
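As a worked illustration of the pseudo-rigid formation just described (the thesis's own notation is not given in the abstract, so the symbols below are assumptions), each vehicle's desired position can be written as an affine image of its fixed position in a reference configuration:

```latex
% Hypothetical notation: X_i is vehicle i's position in the reference
% configuration, c(t) the formation centre, F(t) the homogeneous deformation.
\[
  \mathbf{x}_i(t) \;=\; \mathbf{c}(t) + \mathbf{F}(t)\,\mathbf{X}_i ,
  \qquad \det \mathbf{F}(t) > 0 .
\]
% The polar decomposition separates rotation from stretch/shear:
\[
  \mathbf{F}(t) \;=\; \mathbf{R}(t)\,\mathbf{U}(t) ,
  \qquad \mathbf{R}^{\mathsf{T}}\mathbf{R} = \mathbf{I}, \;\;
  \mathbf{U} = \mathbf{U}^{\mathsf{T}} \succ 0 .
\]
% A rigid formation is the special case U = I; letting U vary allows the
% formation to stretch and shear through narrow or cluttered passages.
```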
In the experiments, we used an M3006V IP camera to detect the location and attitude angle of each vehicle, and the desired distances and angles that the vehicles need to travel to form a pseudo-rigid formation were calculated. This information was then transferred to the base station to initialize the pseudo-rigid formation.
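The message format sent to the base station is not specified in the abstract; the sketch below only illustrates the arithmetic implied here, turning a camera-measured pose and an assigned formation slot into a turn angle and a travel distance. The function and field names are hypothetical.

```python
import math

def wrap_to_pi(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def init_command(pose, slot):
    """pose = (x, y, heading) measured by the overhead camera;
    slot = (x, y) target position in the pseudo-rigid formation.
    Returns (turn angle, travel distance) for the vehicle."""
    x, y, heading = pose
    dx, dy = slot[0] - x, slot[1] - y
    distance = math.hypot(dx, dy)
    turn = wrap_to_pi(math.atan2(dy, dx) - heading)
    return turn, distance

# Example: a vehicle at (1.0, 0.5) facing +x must reach slot (2.0, 1.5).
turn, dist = init_command((1.0, 0.5, 0.0), (2.0, 1.5))
print(math.degrees(turn), dist)   # 45 degrees left, about 1.41 m
```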
After the pseudo-rigid formation was initialized, a Microsoft Kinect sensor was used to sense the environment. The path-planning algorithm on the leader computer was then invoked to obtain the path of the formation center, and the pseudo-rigid formation design algorithm based on the virtual potential theory was executed to yield the desired path of each vehicle. Because the overhead camera can monitor the actual routes and the environment behind obstacles, the adaptability of the pseudo-rigid formation control can be improved.
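The abstract names RRT (with route adjustment) as the planner for the formation center; the following is a minimal generic 2-D RRT sketch, not the thesis's implementation: circular obstacles, a fixed extension step, goal biasing, and no route-adjustment post-processing. All numeric parameters are assumptions.

```python
import math
import random

STEP = 0.2                      # tree extension step (m); assumed value
GOAL_TOL = 0.3                  # distance at which the goal counts as reached
OBSTACLES = [(2.0, 1.5, 0.5)]   # (x, y, radius) circles, e.g. from the Kinect

def collision_free(p):
    """Point-wise obstacle check (edges are not checked in this sketch)."""
    return all(math.hypot(p[0] - ox, p[1] - oy) > r for ox, oy, r in OBSTACLES)

def rrt(start, goal, x_max=4.0, y_max=3.0, max_iter=5000):
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iter):
        # Occasionally sample the goal to bias growth toward it.
        sample = goal if random.random() < 0.1 else (
            random.uniform(0, x_max), random.uniform(0, y_max))
        # Find the nearest existing node.
        i = min(range(len(nodes)),
                key=lambda k: math.hypot(nodes[k][0] - sample[0],
                                         nodes[k][1] - sample[1]))
        nx, ny = nodes[i]
        d = math.hypot(sample[0] - nx, sample[1] - ny)
        if d == 0:
            continue
        new = (nx + STEP * (sample[0] - nx) / d,
               ny + STEP * (sample[1] - ny) / d)
        if not collision_free(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.hypot(new[0] - goal[0], new[1] - goal[1]) < GOAL_TOL:
            # Walk back to the root to recover the centre path.
            path, j = [], len(nodes) - 1
            while j is not None:
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None

print(rrt((0.5, 0.5), (3.5, 2.5)))
```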
author2: Li-Sheng Wang
author_facet: Li-Sheng Wang Jer-Wen Huang 黃哲文
author: Jer-Wen Huang 黃哲文
spellingShingle: Jer-Wen Huang 黃哲文 Development of Global Vision Positioning System for Pseudo-rigid Formation Control
author_sort: Jer-Wen Huang
title: Development of Global Vision Positioning System for Pseudo-rigid Formation Control
title_short: Development of Global Vision Positioning System for Pseudo-rigid Formation Control
title_full: Development of Global Vision Positioning System for Pseudo-rigid Formation Control
title_fullStr: Development of Global Vision Positioning System for Pseudo-rigid Formation Control
title_full_unstemmed: Development of Global Vision Positioning System for Pseudo-rigid Formation Control
title_sort: development of global vision positioning system for pseudo-rigid formation control
publishDate: 2016
url: http://ndltd.ncl.edu.tw/handle/34283458265172647056
work_keys_str_mv: AT jerwenhuang developmentofglobalvisionpositioningsystemforpseudorigidformationcontrol AT huángzhéwén developmentofglobalvisionpositioningsystemforpseudorigidformationcontrol AT jerwenhuang fāzhǎnyòngyúnǐgāngtǐbiānduìkòngzhìzhīquányùshìjuédìngwèixìtǒng AT huángzhéwén fāzhǎnyòngyúnǐgāngtǐbiānduìkòngzhìzhīquányùshìjuédìngwèixìtǒng
_version_: 1718445753806880768