Aiding Grasp Synthesis for Novel Objects Using Heuristic-Based and Data-Driven Active Vision Methods
In this work, we present several heuristic-based and data-driven active vision strategies for viewpoint optimization of an arm-mounted depth camera to aid robotic grasping. These strategies aim to collect data efficiently in order to boost the performance of an underlying grasp synthesis algorithm. We created an open-source benchmarking platform in simulation (https://github.com/galenbr/2021ActiveVision) and provide an extensive study assessing the performance of the proposed methods and comparing them against various baseline strategies. We also provide an experimental study with a real-world two-finger parallel-jaw gripper setup, utilizing an existing grasp planning benchmark from the literature. With these analyses, we quantitatively demonstrate the versatility of heuristic methods that prioritize certain types of exploration, and qualitatively show their robustness both to novel objects and to the transition from simulation to the real world. We identify both scenarios in which our methods did not perform well and scenarios that are objectively difficult, and discuss which avenues for future research show promise.
Main Authors: Sabhari Natarajan, Galen Brown, Berk Calli
Author Affiliations: Manipulation and Environmental Robotics Laboratory (MER Lab), Robotics Engineering Department and Computer Science Department, Worcester Polytechnic Institute, Worcester, MA, United States
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-07-01
Series: Frontiers in Robotics and AI
ISSN: 2296-9144
DOI: 10.3389/frobt.2021.696587
Subjects: active vision; grasp synthesis; reinforcement learning; self-supervised learning; benchmarking
Online Access: https://www.frontiersin.org/articles/10.3389/frobt.2021.696587/full
Record ID: doaj-9849ba6c068446789acfb4f0e381f6d0
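
To make the abstract's notion of heuristic viewpoint optimization concrete, the sketch below implements a toy greedy next-best-view loop: candidate camera poses are sampled on a view sphere around the object, scored by a simple information-gain proxy, and the best-scoring unvisited pose is selected. This is an illustrative sketch under assumed names and an assumed scoring rule, not the paper's algorithm; the authors' actual implementation is in the linked repository (https://github.com/galenbr/2021ActiveVision).

```python
# Hypothetical sketch of a greedy heuristic next-best-view loop.
# NOT the paper's method: sphere_viewpoints, heuristic_score, and
# next_best_view are illustrative names, and the scoring rule is a
# deliberately simple stand-in for true information gain.
import numpy as np


def sphere_viewpoints(center, radius, n_azimuth=8, n_elevation=3):
    """Candidate camera positions sampled on a view sphere around the object."""
    views = []
    for elev in np.linspace(np.pi / 6, np.pi / 2 - 0.1, n_elevation):
        for azim in np.linspace(0.0, 2.0 * np.pi, n_azimuth, endpoint=False):
            offset = np.array([np.cos(elev) * np.cos(azim),
                               np.cos(elev) * np.sin(azim),
                               np.sin(elev)])
            views.append(center + radius * offset)
    return views


def heuristic_score(view, center, unseen_points, fov_cos=0.5):
    """Toy information-gain proxy: count unseen surface points that face the
    camera, approximating each point's normal by its direction from the
    object centroid (a real system would ray-cast through an occupancy grid)."""
    view_dir = (view - center) / np.linalg.norm(view - center)
    dirs = unseen_points - center
    dirs /= (np.linalg.norm(dirs, axis=1, keepdims=True) + 1e-9)
    return int(np.sum(dirs @ view_dir > fov_cos))


def next_best_view(center, unseen_points, visited, radius=0.5):
    """Greedy step: pick the unvisited candidate with the highest score."""
    candidates = [v for v in sphere_viewpoints(center, radius)
                  if not any(np.allclose(v, u) for u in visited)]
    return max(candidates, key=lambda v: heuristic_score(v, center, unseen_points))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    center = np.zeros(3)
    # Fake "unseen" surface points on a unit sphere stand in for the object.
    pts = rng.normal(size=(500, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    view = next_best_view(center, pts, visited=[])
    print("next viewpoint:", np.round(view, 3))
```

The greedy structure (score all candidates, move to the best, repeat until a grasp-quality threshold or a view budget is reached) is the common skeleton shared by most heuristic next-best-view methods; the strategies compared in the paper differ chiefly in what the scoring function prioritizes.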