A Virtual End-to-End Learning System for Robot Navigation Based on Temporal Dependencies


Bibliographic Details
Main Authors: Yanqiu Zhang, Ruiquan Ge, Lei Lyu, Jinling Zhang, Chen Lyu, Xiaojuan Yang
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9144508/
Description
Summary: Steering a wheeled mobile robot through a variety of environments is a complex task. To achieve this, many researchers have tried to map the front-facing camera data stream to corresponding steering angles using a convolutional neural network (CNN) model. However, most existing methods suffer from high data-acquisition costs and long training cycles. To address these issues, this paper proposes an innovative end-to-end deep neural network model that fully considers the temporal relationships in the data by incorporating long short-term memory (LSTM) into the CNN model. In addition, to obtain enough data to train and test the model, we establish a simulation system capable of creating realistic environments with various weather and road conditions, in which robots must avoid static and dynamic obstacles. First, we use the system to capture raw image sequences in different environments as a training set; we then test the trained model in the system to realize an autonomous mobile robot that can adapt to various environments. The experimental results demonstrate that the proposed model not only effectively extracts the road-vision features most relevant to navigation, but also learns the temporal dependence between the motion states and image features contained in a sequence.
ISSN:2169-3536
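The summary describes a pipeline in which a CNN extracts per-frame visual features and an LSTM accumulates them over the image sequence before a steering angle is predicted. The paper does not publish its architecture details here, so the following is only a minimal NumPy sketch of that idea: all dimensions, the random-projection "CNN" stand-in, and the output head are hypothetical illustrations, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes (not from the paper): feature vector and LSTM state.
FEAT, HID = 64, 32

# Stand-in "CNN": a fixed random projection of a 48x64 grayscale frame to a
# feature vector. A real model would use stacked convolutional layers.
W_cnn = rng.standard_normal((FEAT, 48 * 64)) * 0.01

def cnn_features(frame):
    return np.tanh(W_cnn @ frame.ravel())

# One LSTM cell: gates consume [frame features, previous hidden state].
W = rng.standard_normal((4 * HID, FEAT + HID)) * 0.1
b = np.zeros(4 * HID)

def lstm_step(x, h, c):
    z = W @ np.concatenate([x, h]) + b
    i, f, o = (sigmoid(z[k * HID:(k + 1) * HID]) for k in range(3))
    g = np.tanh(z[3 * HID:])
    c = f * c + i * g           # cell state carries the temporal context
    h = o * np.tanh(c)
    return h, c

# Output head: final hidden state -> a single steering angle.
w_out = rng.standard_normal(HID) * 0.1

def predict_steering(frames):
    h, c = np.zeros(HID), np.zeros(HID)
    for frame in frames:        # iterate over the camera image sequence
        h, c = lstm_step(cnn_features(frame), h, c)
    return float(np.tanh(w_out @ h))  # bounded angle in [-1, 1]

angle = predict_steering(rng.random((5, 48, 64)))  # a 5-frame clip
```

In this sketch the recurrence is what distinguishes the model from a per-frame CNN regressor: the predicted angle depends on the whole clip through the LSTM state, which is the temporal dependence the abstract emphasizes.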