A Fast Feature Points-Based Object Tracking Method for Robot Grasp

In this paper, we propose a fast feature points-based object tracking method for robot grasp. In the detection phase, we detect the object by extracting and matching SIFT feature points. We then compute the object's image position under homography constraints and set up an interest window that accommodates the object. In the tracking phase, we focus only on the interest window, detecting feature points within it and updating its position and size. Our method is of particular practical value for service robot grasping: when the robot grasps the object, the object's image size is usually small relative to the whole image, so detecting over the whole image is unnecessary. Moreover, the object is partially occluded by the robot gripper. SIFT handles occlusion well but is time consuming; by combining SIFT with an interest window, our method copes with occlusion while satisfying real-time requirements. Experiments show that our method outperforms several leading feature points-based object tracking methods in real-time performance.
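The following is a minimal sketch of the detect-then-track-in-window pipeline described in the abstract, assuming OpenCV's SIFT implementation. It is an illustrative reconstruction, not the authors' code; the window padding, match ratio, and RANSAC threshold are assumptions.

```python
# Illustrative sketch (assumptions: OpenCV SIFT, padding/ratio/threshold values chosen for demonstration).
import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)

def match_sift(desc_obj, desc_scene, ratio=0.75):
    """Lowe's ratio test on kNN matches (ratio value is an assumption)."""
    matches = matcher.knnMatch(desc_obj, desc_scene, k=2)
    return [m for m, n in matches if m.distance < ratio * n.distance]

def detect_object(obj_img, scene_img, pad=20):
    """Detection phase: SIFT matching + homography -> padded interest window (x, y, w, h)."""
    kp_o, des_o = sift.detectAndCompute(obj_img, None)
    kp_s, des_s = sift.detectAndCompute(scene_img, None)
    good = match_sift(des_o, des_s)
    if len(good) < 4:                      # a homography needs at least 4 correspondences
        return None
    src = np.float32([kp_o[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_s[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None
    h, w = obj_img.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    x, y, bw, bh = cv2.boundingRect(cv2.perspectiveTransform(corners, H))
    return (max(x - pad, 0), max(y - pad, 0), bw + 2 * pad, bh + 2 * pad)

def track_in_window(obj_des, frame, window, pad=20):
    """Tracking phase: extract SIFT only inside the interest window, then update it."""
    x, y, w, h = window
    roi = frame[y:y + h, x:x + w]
    kp, des = sift.detectAndCompute(roi, None)
    if des is None:
        return None
    good = match_sift(obj_des, des)
    if not good:
        return None
    pts = np.float32([kp[m.trainIdx].pt for m in good])
    rx, ry, rw, rh = cv2.boundingRect(pts)
    # Shift back to full-frame coordinates and re-pad the window for the next frame.
    return (max(x + rx - pad, 0), max(y + ry - pad, 0), rw + 2 * pad, rh + 2 * pad)
```

A typical loop would compute the object template's SIFT descriptors once, call detect_object on the first frame, then call track_in_window on each subsequent frame, falling back to full-frame detection whenever tracking returns None.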

Bibliographic Details
Main Authors: Yang Yang, Qixin Cao (Research Institute of Robotics, Shanghai Jiao Tong University, Shanghai, China)
Format: Article
Language: English
Published: SAGE Publishing, 2013-03-01
Series: International Journal of Advanced Robotic Systems, Vol. 10
ISSN: 1729-8814
Online Access: https://doi.org/10.5772/55951