Mapless LiDAR Navigation Control of Wheeled Mobile Robots Based on Deep Imitation Learning
This paper addresses the problems related to the mapless navigation control of wheeled mobile robots based on deep learning technology. The traditional navigation control framework is based on a global map of the environment, and its navigation performance depends on the quality of the global map. I...
Main Authors: | Chi-Yi Tsai, Humaira Nisar, Yu-Chen Hu
---|---
Format: | Article
Language: | English
Published: | IEEE, 2021-01-01
Series: | IEEE Access
Subjects: | Deep imitation learning; end-to-end learning; mapless LiDAR navigation control; behavior cloning
Online Access: | https://ieeexplore.ieee.org/document/9521224/
id | doaj-77a6544162b748059f7d24c863a6deb0
---|---
record_format | Article
spelling | Chi-Yi Tsai (ORCID: 0000-0001-9872-4338), Department of Electrical and Computer Engineering, Tamkang University, Tamsui, New Taipei City, Taiwan; Humaira Nisar (ORCID: 0000-0003-2026-5666), Department of Electronic Engineering, Universiti Tunku Abdul Rahman, Kampar, Perak, Malaysia; Yu-Chen Hu (ORCID: 0000-0003-1872-9214), Department of Electrical and Computer Engineering, Tamkang University, Tamsui, New Taipei City, Taiwan. IEEE Access, vol. 9, pp. 117527-117541, 2021-01-01. ISSN 2169-3536. DOI: 10.1109/ACCESS.2021.3107041 (IEEE document 9521224). Indexed 2021-08-27T23:01:12Z.
collection | DOAJ
language | English
format | Article
sources | DOAJ
author | Chi-Yi Tsai; Humaira Nisar; Yu-Chen Hu
title | Mapless LiDAR Navigation Control of Wheeled Mobile Robots Based on Deep Imitation Learning
publisher | IEEE
series | IEEE Access
issn | 2169-3536
publishDate | 2021-01-01
description | This paper addresses the problems related to the mapless navigation control of wheeled mobile robots based on deep learning technology. The traditional navigation control framework is based on a global map of the environment, and its navigation performance depends on the quality of the global map. In this paper, we propose a mapless Light Detection and Ranging (LiDAR) navigation control method for wheeled mobile robots based on deep imitation learning. The proposed method is a data-driven control method that directly uses raw LiDAR measurements and the relative target position for mobile robot navigation control. A deep convolutional neural network (CNN) model is proposed to predict the motion control commands of the mobile robot without requiring a global map, so that the robot can navigate in unknown environments. To collect the training dataset, we manually drove the mobile robot around obstacles and recorded the raw LiDAR data, the relative target position, and the corresponding motion control commands. We then applied a data augmentation method to the recorded samples to increase the number of training samples in the dataset. The proposed CNN model consists of a LiDAR CNN module that extracts LiDAR features and a motion prediction module that predicts the motion behavior of the robot. During training, the model learns the mapping between the input sensor data and the desired motion behavior through end-to-end imitation learning. Experimental results show that the proposed mapless LiDAR navigation control method can safely navigate the mobile robot in four unseen environments with an average success rate of 75%. The proposed mapless LiDAR navigation control system is therefore effective for robot navigation control in unknown environments without a global map. (An illustrative sketch of this kind of end-to-end model follows this record.)
topic | Deep imitation learning; end-to-end learning; mapless LiDAR navigation control; behavior cloning
url | https://ieeexplore.ieee.org/document/9521224/
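The abstract above only names the two components of the proposed model (a LiDAR CNN module and a motion prediction module) and the end-to-end imitation-learning setup; the paper's actual layer sizes, loss function, and command representation are not given in this record. The following PyTorch sketch is therefore a minimal, assumed illustration of such a pipeline: a 1-D CNN over a LiDAR scan, an MLP head that fuses the LiDAR features with the relative target position, and a behavior-cloning step that regresses recorded expert velocity commands. All module names (`LidarFeatureExtractor`, `MotionPredictor`, `MaplessNavNet`), layer sizes, the 360-beam scan, the (distance, angle) target encoding, and the MSE loss are assumptions made for illustration, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): an end-to-end behavior-cloning
# model that maps a 1-D LiDAR scan plus a relative target position to a motion
# command. Layer sizes, module names, and the loss choice are assumptions.
import torch
import torch.nn as nn

class LidarFeatureExtractor(nn.Module):
    """1-D CNN over a single LiDAR scan (assumed 360 range readings)."""
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                     # -> (N, 64, 1)
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, scan: torch.Tensor) -> torch.Tensor:
        # scan: (N, num_beams) range values, assumed normalized to [0, 1]
        x = self.conv(scan.unsqueeze(1)).squeeze(-1)     # (N, 64)
        return self.proj(x)                              # (N, feat_dim)

class MotionPredictor(nn.Module):
    """Fuses LiDAR features with the relative target position (distance, angle)
    and predicts a motion command (linear and angular velocity)."""
    def __init__(self, feat_dim: int = 64, target_dim: int = 2, cmd_dim: int = 2):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim + target_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, cmd_dim),
        )

    def forward(self, lidar_feat: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return self.mlp(torch.cat([lidar_feat, target], dim=1))

class MaplessNavNet(nn.Module):
    """End-to-end model: raw LiDAR scan + relative target -> velocity command."""
    def __init__(self):
        super().__init__()
        self.lidar_cnn = LidarFeatureExtractor()
        self.motion_head = MotionPredictor()

    def forward(self, scan: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return self.motion_head(self.lidar_cnn(scan), target)

# One behavior-cloning step: regress the recorded expert commands (dummy tensors here).
model = MaplessNavNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scan = torch.rand(8, 360)        # batch of LiDAR scans
target = torch.rand(8, 2)        # relative target as (distance, angle)
expert_cmd = torch.rand(8, 2)    # recorded (linear, angular) velocity commands
loss = nn.functional.mse_loss(model(scan, target), expert_cmd)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Behavior cloning of this kind is straightforward to train, but its generalization depends on how well the recorded demonstrations cover the deployment conditions, which is one common motivation for the data augmentation step the record mentions.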