Interactive perception of articulated objects for autonomous manipulation

This thesis develops robotic skills for manipulating novel articulated objects. The degrees of freedom of an articulated object describe the relationship among its rigid bodies, and are often relevant to the object's intended function. Examples of everyday articulated objects include scissors,...


Bibliographic Details
Main Author: Katz, Dov
Language: ENG
Published: ScholarWorks@UMass Amherst 2011
Subjects: Robotics|Artificial intelligence|Computer science
Online Access:https://scholarworks.umass.edu/dissertations/AAI3482710
id ndltd-UMASS-oai-scholarworks.umass.edu-dissertations-6336
record_format oai_dc
spelling ndltd-UMASS-oai-scholarworks.umass.edu-dissertations-63362020-12-02T14:32:23Z Interactive perception of articulated objects for autonomous manipulation Katz, Dov This thesis develops robotic skills for manipulating novel articulated objects. The degrees of freedom of an articulated object describe the relationship among its rigid bodies and are often relevant to the object's intended function. Examples of everyday articulated objects include scissors, pliers, doors, door handles, books, and drawers. Autonomous manipulation of articulated objects is therefore a prerequisite for many robotic applications in our everyday environments. Already today, robots perform complex manipulation tasks with impressive accuracy and speed in controlled environments such as factory floors. An important characteristic of these environments is that they can be engineered to reduce or even eliminate the need for perception. In contrast, in unstructured environments such as our homes and offices, perception is typically much more challenging. Indeed, manipulation in these unstructured environments remains largely unsolved. We therefore assume that to enable autonomous manipulation of objects in our everyday environments, robots must be able to acquire information about these objects while making as few assumptions about the environment as possible. Acquiring information about the world from sensor data is a challenging problem. Because there is so much information that could be measured about the environment, considering all of it is impractical given current computational speeds. Instead, we propose to leverage our understanding of the task in order to determine the relevant information. In our case, this information consists of the object's shape and kinematic structure. Perceiving this task-specific information is still challenging: to understand the object's degrees of freedom, we must observe relative motion between its rigid bodies.
Because relative motion is not guaranteed to occur, this information may not be included in the sensor stream. The main contribution of this thesis is the design and implementation of a robotic system capable of perceiving and manipulating articulated objects. This system relies on Interactive Perception, an approach that exploits the synergies that arise when crossing the boundary between action and perception. In interactive perception, the emphasis of perception shifts from object appearance to object function. To enable the perception and manipulation of articulated objects, this thesis develops algorithms for perceiving the kinematic structure and shape of objects. The resulting perceptual capabilities are used within a relational reinforcement learning framework, enabling a robot to obtain general domain knowledge for manipulation. This combination enables our robot to reliably and efficiently manipulate novel articulated objects. To verify the effectiveness of the proposed robotic system, simulated and real-world experiments were conducted with a variety of everyday objects. 2011-01-01T08:00:00Z text https://scholarworks.umass.edu/dissertations/AAI3482710 Doctoral Dissertations Available from Proquest ENG ScholarWorks@UMass Amherst Robotics|Artificial intelligence|Computer science
collection NDLTD
language ENG
sources NDLTD
topic Robotics|Artificial intelligence|Computer science
spellingShingle Robotics|Artificial intelligence|Computer science
Katz, Dov
Interactive perception of articulated objects for autonomous manipulation
description This thesis develops robotic skills for manipulating novel articulated objects. The degrees of freedom of an articulated object describe the relationship among its rigid bodies and are often relevant to the object's intended function. Examples of everyday articulated objects include scissors, pliers, doors, door handles, books, and drawers. Autonomous manipulation of articulated objects is therefore a prerequisite for many robotic applications in our everyday environments. Already today, robots perform complex manipulation tasks with impressive accuracy and speed in controlled environments such as factory floors. An important characteristic of these environments is that they can be engineered to reduce or even eliminate the need for perception. In contrast, in unstructured environments such as our homes and offices, perception is typically much more challenging. Indeed, manipulation in these unstructured environments remains largely unsolved. We therefore assume that to enable autonomous manipulation of objects in our everyday environments, robots must be able to acquire information about these objects while making as few assumptions about the environment as possible. Acquiring information about the world from sensor data is a challenging problem. Because there is so much information that could be measured about the environment, considering all of it is impractical given current computational speeds. Instead, we propose to leverage our understanding of the task in order to determine the relevant information. In our case, this information consists of the object's shape and kinematic structure. Perceiving this task-specific information is still challenging: to understand the object's degrees of freedom, we must observe relative motion between its rigid bodies. Because relative motion is not guaranteed to occur, this information may not be included in the sensor stream.
The main contribution of this thesis is the design and implementation of a robotic system capable of perceiving and manipulating articulated objects. This system relies on Interactive Perception, an approach that exploits the synergies that arise when crossing the boundary between action and perception. In interactive perception, the emphasis of perception shifts from object appearance to object function. To enable the perception and manipulation of articulated objects, this thesis develops algorithms for perceiving the kinematic structure and shape of objects. The resulting perceptual capabilities are used within a relational reinforcement learning framework, enabling a robot to obtain general domain knowledge for manipulation. This combination enables our robot to reliably and efficiently manipulate novel articulated objects. To verify the effectiveness of the proposed robotic system, simulated and real-world experiments were conducted with a variety of everyday objects.
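The perceptual step the abstract describes — recovering an object's rigid bodies from the relative motion observed while the robot interacts with it — can be sketched with a minimal example. This is an illustrative sketch, not the thesis's actual algorithm: the feature names, 2D tracks, tolerance, and clustering scheme are all assumptions. The idea is simply that two tracked features lie on the same rigid body if the distance between them stays constant across frames.

```python
import math
from itertools import combinations

def cluster_rigid_bodies(tracks, tol=1e-6):
    """Group tracked features into rigid bodies.

    Two features belong to the same rigid body if the distance between
    them stays (nearly) constant across all observed frames.

    tracks: dict mapping feature name -> list of (x, y) positions,
            one position per frame.
    Returns a list of frozensets of feature names, one per rigid body.
    """
    names = list(tracks)
    linked = {n: {n} for n in names}
    for a, b in combinations(names, 2):
        dists = [math.dist(pa, pb) for pa, pb in zip(tracks[a], tracks[b])]
        if max(dists) - min(dists) < tol:
            linked[a].add(b)
            linked[b].add(a)
    # Merge pairwise rigidity links into connected components.
    bodies, seen = [], set()
    for n in names:
        if n in seen:
            continue
        comp, stack = set(), [n]
        while stack:
            m = stack.pop()
            if m not in comp:
                comp.add(m)
                stack.extend(linked[m] - comp)
        seen |= comp
        bodies.append(frozenset(comp))
    return bodies

# Synthetic example: a "door" rotating about a hinge at the origin,
# observed over three frames, while two features on the static door
# frame stay put. Without this relative motion, the two bodies would
# be indistinguishable -- hence the need for interactive perception.
angles = [0.0, 0.3, 0.6]
tracks = {
    "frame_a": [(5.0, 5.0)] * 3,
    "frame_b": [(6.0, 5.0)] * 3,
    "door_a": [(math.cos(t), math.sin(t)) for t in angles],
    "door_b": [(2 * math.cos(t), 2 * math.sin(t)) for t in angles],
}
bodies = cluster_rigid_bodies(tracks)
# → two bodies: {frame_a, frame_b} and {door_a, door_b}
```

Once the bodies are separated, the relative transform between them over time hints at the joint type: motion about a fixed point suggests a revolute joint, a constant translation direction a prismatic one.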
author Katz, Dov
author_facet Katz, Dov
author_sort Katz, Dov
title Interactive perception of articulated objects for autonomous manipulation
title_short Interactive perception of articulated objects for autonomous manipulation
title_full Interactive perception of articulated objects for autonomous manipulation
title_fullStr Interactive perception of articulated objects for autonomous manipulation
title_full_unstemmed Interactive perception of articulated objects for autonomous manipulation
title_sort interactive perception of articulated objects for autonomous manipulation
publisher ScholarWorks@UMass Amherst
publishDate 2011
url https://scholarworks.umass.edu/dissertations/AAI3482710
work_keys_str_mv AT katzdov interactiveperceptionofarticulatedobjectsforautonomousmanipulation
_version_ 1719364468240023552