Mobile manipulation in unstructured environments with haptic sensing and compliant joints
Main Author: | Jain, Advait
---|---
Published: | Georgia Institute of Technology, 2013
Subjects: | Mobile manipulation; Haptics; Compliant control; Touch; Robots -- Motion; Robots -- Programming; Robotics; Automation
Online Access: | http://hdl.handle.net/1853/45788
id
ndltd-GATECH-oai-smartech.gatech.edu-1853-45788
record_format
oai_dc
collection
NDLTD
sources
NDLTD
topic
Mobile manipulation; Haptics; Compliant control; Touch; Robots -- Motion; Robots -- Programming; Robotics; Automation
description
We make two main contributions in this thesis. First, we present our approach to robot manipulation, which emphasizes the benefits of making contact with the world across all the surfaces of a manipulator with whole-arm tactile sensing and compliant actuation at the joints. In contrast, many current approaches to mobile manipulation assume most contact is a failure of the system, restrict contact to only occur at well-modeled end effectors, and use stiff, precise control to avoid contact.
We develop a controller that enables robots with whole-arm tactile sensing and compliant actuation at the joints to reach to locations in high clutter while regulating contact forces. We assume
that low contact forces are benign and our controller does not place any penalty on contact forces below a threshold. Our controller only requires haptic sensing, handles multiple contacts across the surface of the manipulator, and does not need an explicit model of the environment prior to contact. It uses model predictive control with a time horizon of length one, and a linear quasi-static mechanical model that it constructs at each time step.
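As a rough, hedged sketch of that single-time-step structure (not the thesis implementation), the following Python/NumPy snippet builds a linear quasi-static model of the sensed contacts at the current configuration, penalizes only forces above a threshold, and solves for one small joint displacement toward the goal. The names `one_step_reach`, `f_max`, `alpha`, and the per-contact stiffness estimates `k_i` are all illustrative assumptions.

```python
import numpy as np

def one_step_reach(q, x_goal, ee_jacobian, contacts, f_max=5.0,
                   alpha=1.0, step_limit=0.02):
    """One greedy, horizon-one step toward x_goal under contact-force limits.

    q           : current joint angles, shape (n,)
    x_goal      : desired end-effector position, shape (3,)
    ee_jacobian : callable q -> (x_ee, J_ee) with J_ee of shape (3, n)
    contacts    : list of (f_i, J_i, k_i): measured force magnitude,
                  contact Jacobian row of shape (n,), local stiffness estimate
    """
    x_ee, J_ee = ee_jacobian(q)

    # Cost term 1: squared distance of the predicted end effector,
    # x_ee + J_ee @ dq, from the goal.
    H = J_ee.T @ J_ee
    g = J_ee.T @ (x_ee - x_goal)

    # Cost term 2: predicted excess force at contacts already above f_max,
    # using the linear quasi-static model df_i ~= k_i * (J_i @ dq).
    # Contacts below the threshold incur no penalty at all.
    for f_i, J_i, k_i in contacts:
        excess = f_i - f_max
        if excess > 0.0:
            H += alpha * (k_i ** 2) * np.outer(J_i, J_i)
            g += alpha * k_i * excess * J_i

    # Unconstrained minimizer of 0.5 dq^T H dq + g^T dq, then clip the step.
    dq = -np.linalg.solve(H + 1e-6 * np.eye(len(q)), g)
    return np.clip(dq, -step_limit, step_limit)
```

The actual controller regulates forces at many contacts across the arm's surface and is re-run at every time step with a freshly built linear model; the sketch only conveys that greedy, horizon-one structure.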
We show that our controller enables both real and simulated robots to reach goal locations in high clutter with low contact forces. While doing so, the robots bend, compress, slide, and pivot around objects. To enable experiments on real robots, we also developed an inexpensive, flexible, and stretchable tactile sensor and covered large surfaces of two robot arms with these sensors. With an informal experiment, we show that our controller and sensor have the potential to enable robots to manipulate in close proximity to, and in contact with, humans while keeping the contact forces low.
Second, we present an approach to give robots common sense about everyday forces in the form of probabilistic data-driven object-centric models of haptic interactions. These models can be shared by different robots for improved manipulation performance. We use pulling open doors, an important task for service robots, as an example to demonstrate our approach.
Specifically, we capture and model the statistics of forces while pulling open doors and drawers. Using a portable custom force and motion capture system, we create a database of forces as human operators pull open doors and drawers in six homes and one office. We then build data-driven
models of the expected forces while opening a mechanism, given knowledge of either its class (e.g., refrigerator) or the mechanism's identity (e.g., a particular cabinet in Advait's kitchen). We demonstrate that these models can enable robots to detect anomalous conditions, such as a locked door or collisions between the door and the environment, faster and with lower excess force applied to the door compared to methods that do not use a database of forces.
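As a rough illustration of how such object-centric force models might be queried (a sketch under assumed names, not the thesis code), the snippet below bins recorded opening forces by door angle, stores a per-bin mean and standard deviation, and flags an anomaly when a live measurement leaves that band.

```python
import numpy as np

def build_force_model(trials, angle_bins):
    """Per-angle-bin mean/std of opening force from recorded trials.

    trials     : list of (angles, forces) arrays from prior pulls of the
                 same mechanism class or the same specific mechanism
    angle_bins : monotonically increasing bin edges (radians)
    """
    binned = [[] for _ in range(len(angle_bins) - 1)]
    for angles, forces in trials:
        idx = np.digitize(angles, angle_bins) - 1
        for i, f in zip(idx, forces):
            if 0 <= i < len(binned):
                binned[i].append(f)
    mean = np.array([np.mean(b) if b else np.nan for b in binned])
    std = np.array([np.std(b) if b else np.nan for b in binned])
    return mean, std

def is_anomalous(angle, force, mean, std, angle_bins, n_sigma=3.0):
    """True if the measured force at this door angle has left the band
    learned for this mechanism (e.g. a locked door or a collision)."""
    i = int(np.digitize([angle], angle_bins)[0]) - 1
    if i < 0 or i >= len(mean) or np.isnan(mean[i]):
        return False          # no prior data for this opening angle
    return abs(force - mean[i]) > n_sigma * max(std[i], 1e-6)
```

Calling `is_anomalous` at each control cycle while pulling would let a robot stop as soon as the force departs from what the class-level or mechanism-specific model expects, which is the sense in which such models support faster detection with lower excess force.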
author
Jain, Advait
title
Mobile manipulation in unstructured environments with haptic sensing and compliant joints
publisher
Georgia Institute of Technology
publishDate
2013
url
http://hdl.handle.net/1853/45788