Fusion of time of flight (ToF) camera's ego-motion and inertial navigation.

For mobile robots to navigate autonomously, one of the most important and challenging tasks is localisation. Localisation refers to the process whereby a robot locates itself within a map of a known environment or with respect to a known starting point within an unknown environment. Localisation of a...


Bibliographic Details
Main Author: Ratshidaho, Thikhathali Terence.
Other Authors: Tapamo, Jules-Raymond.
Language: en_ZA
Published: 2014
Subjects: Mobile robots; Robots--Control systems; Robots--Motion; Theses--Computer engineering
Online Access: http://hdl.handle.net/10413/11181
id ndltd-netd.ac.za-oai-union.ndltd.org-ukzn-oai-http---researchspace.ukzn.ac.za-10413-11181
record_format oai_dc
collection NDLTD
language en_ZA
sources NDLTD
topic Mobile robots.
Robots--Control systems.
Robots--Motion.
Theses--Computer engineering.
description For mobile robots to navigate autonomously, one of the most important and challenging tasks is localisation. Localisation refers to the process whereby a robot locates itself within a map of a known environment or with respect to a known starting point within an unknown environment. Localisation of a robot in an unknown environment is done by tracking the trajectory of the robot whilst knowing the initial pose. Trajectory estimation becomes challenging if the robot is operating in an unknown environment that has a scarcity of landmarks, is GPS-denied, and is slippery and dark, such as in underground mines. This dissertation addresses the problem of estimating a robot's trajectory in underground mining environments. In the past, this problem has been addressed by using a 3D laser scanner. 3D laser scanners are expensive and consume a lot of power, even though they have high measurement accuracy and a wide field of view. For this research work, trajectory estimation is accomplished by fusing the ego-motion provided by a Time of Flight (ToF) camera with measurement data provided by a low-cost Inertial Measurement Unit (IMU). The fusion is performed using a Kalman filter algorithm on a mobile robot moving on a 2D planar surface. The results show a significant improvement in the trajectory estimation. Trajectory estimation using the ToF camera only is erroneous, especially when the robot is rotating. The fused trajectory estimation algorithm is able to estimate accurate ego-motion even when the robot is rotating. === [Durban, South Africa] : University of KwaZulu-Natal, 2013.
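
To make the fusion approach described in the abstract concrete, the sketch below shows a generic planar pose filter of the kind the abstract refers to: IMU readings drive the prediction step, and the ToF camera's ego-motion estimate is applied as a pose measurement in the update step. This is an illustrative sketch only, not the implementation from the dissertation; the state layout, noise values, and names (PlanarPoseKalmanFilter, predict, update) are assumptions, and the linearised motion model makes it an extended-Kalman-filter-style variant rather than a strictly linear Kalman filter.

# Illustrative sketch (assumed formulation, not the thesis's code): a planar
# pose filter over the state [x, y, theta]. IMU-derived speed and yaw rate
# propagate the state; the ToF camera's ego-motion pose corrects it.
import numpy as np

class PlanarPoseKalmanFilter:
    def __init__(self):
        self.x = np.zeros(3)                      # state: [x, y, theta]
        self.P = np.eye(3) * 1e-3                 # state covariance
        self.Q = np.diag([1e-2, 1e-2, 1e-3])      # assumed IMU process noise
        self.R = np.diag([5e-2, 5e-2, 1e-2])      # assumed ToF measurement noise
        self.H = np.eye(3)                        # ToF ego-motion observed as a full pose

    def predict(self, v, yaw_rate, dt):
        # Propagate the pose with a unicycle model driven by IMU-derived
        # forward speed v and yaw rate over the interval dt.
        theta = self.x[2]
        self.x = self.x + np.array([v * np.cos(theta) * dt,
                                    v * np.sin(theta) * dt,
                                    yaw_rate * dt])
        # Jacobian of the motion model with respect to the state (linearisation).
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, tof_pose):
        # Correct the prediction with the ToF camera's ego-motion pose
        # estimate [x, y, theta]. Angle wrapping is omitted for brevity.
        z = np.asarray(tof_pose, dtype=float)
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R   # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ self.H) @ self.P

In such a scheme, a caller would typically invoke predict at the IMU rate and update whenever a new ToF ego-motion estimate becomes available.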
author2 Tapamo, Jules-Raymond.
author Ratshidaho, Thikhathali Terence.
title Fusion of time of flight (ToF) camera's ego-motion and inertial navigation.
publishDate 2014
url http://hdl.handle.net/10413/11181