Navigation and Self-Semantic Location of Drones in Indoor Environments by Combining the Visual Bug Algorithm and Entropy-Based Vision
We introduce a hybrid algorithm for the self-semantic location and autonomous navigation of robots using entropy-based vision and visual topological maps. In visual topological maps, the visual landmarks are considered as leave points for guiding the robot to reach a target point (robot homing) in in...
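The abstract refers to entropy-based vision for selecting visual landmarks. As a minimal, hedged sketch only (not the paper's actual implementation; the function name and the sample patch below are assumptions for illustration), the Shannon entropy of an image patch's intensity histogram is one common way to score how visually informative a candidate landmark region is:

```python
# Illustrative sketch: Shannon entropy of a grayscale patch as a saliency cue.
# This is an assumption-based example, not the method described in the article.
import numpy as np

def patch_entropy(patch: np.ndarray, bins: int = 256) -> float:
    """Return the Shannon entropy (in bits) of the patch's intensity histogram."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    p = hist.astype(float) / max(hist.sum(), 1)  # normalize counts to probabilities
    p = p[p > 0]                                 # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

if __name__ == "__main__":
    # Hypothetical data: a random 64x64 8-bit patch stands in for a camera crop.
    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    print(f"patch entropy: {patch_entropy(patch):.2f} bits")
```

Under this assumption, higher-entropy patches carry richer texture and would be preferred as candidate landmarks (leave points) in a visual topological map.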
| Main Authors: | Darío Maravall, Javier de Lope, Juan P. Fuentes |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A. (2017-08-01) |
| Series: | Frontiers in Neurorobotics |
| Online Access: | http://journal.frontiersin.org/article/10.3389/fnbot.2017.00046/full |
Similar Items
- Drone Journalism as Visual Aggregation: Toward a Critical History
  by: James F. Hamilton
  Published: (2020-07-01)
- Visual SLAM for Indoor Livestock and Farming Using a Small Drone with a Monocular Camera: A Feasibility Study
  by: Sander Krul, et al.
  Published: (2021-05-01)
- A survey on vision-based UAV navigation
  by: Yuncheng Lu, et al.
  Published: (2018-01-01)
- Monocular vision-aided inertial navigation for unmanned aerial vehicles
  by: Magree, Daniel Paul
  Published: (2015)
- Autonomous Vision-Based Aerial Grasping for Rotorcraft Unmanned Aerial Vehicles
  by: Lishan Lin, et al.
  Published: (2019-08-01)