Mobile Mixed-Reality Interfaces That Enhance Human–Robot Interaction in Shared Spaces
Although user interfaces with gesture-based input and augmented graphics have promoted intuitive human–robot interaction (HRI), they are often implemented in remote applications on research-grade platforms that require significant training and limit operator mobility. This paper proposes a mobile m...
Main Authors: Jared A. Frank, Matthew Moorhead, Vikram Kapila
Format: Article
Language: English
Published: Frontiers Media S.A., 2017-06-01
Series: Frontiers in Robotics and AI
Online Access: http://journal.frontiersin.org/article/10.3389/frobt.2017.00020/full
Similar Items
- IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design / by: Dennis Krupke, et al. Published: (2019-04-01)
- Mixing Educational Robotics, Tangibles and Mixed Reality Environments for the Interdisciplinary Learning of Geography and History / by: Stefanos Xefteris, et al. Published: (2019-04-01)
- A Systematic Review of Virtual Reality Interfaces for Controlling and Interacting with Robots / by: Murphy Wonsick, et al. Published: (2020-12-01)
- A Mixed-Reality Platform for Robotics and Intelligent Vehicles / by: Grünwald, Norbert. Published: (2012)
- Mixed Reality Enhanced User Interactive Path Planning for Omnidirectional Mobile Robot / by: Mulun Wu, et al. Published: (2020-02-01)