Humans Can Visually Judge Grasp Quality and Refine Their Judgments Through Visual and Haptic Feedback


Bibliographic Details
Main Authors: Guido Maiello, Marcel Schepko, Lina K. Klein, Vivian C. Paulun, Roland W. Fleming
Format: Article
Language: English
Published: Frontiers Media S.A. 2021-01-01
Series: Frontiers in Neuroscience
Subjects: grasping; visual grasp selection; precision grip; shape; material; motor imagery
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2020.591898/full
id doaj-b4950874db1c40ef8a367da2473d1127
record_format Article
spelling doaj-b4950874db1c40ef8a367da2473d1127 2021-01-12T05:45:19Z
doi 10.3389/fnins.2020.591898
volume 14
article_number 591898
affiliation Guido Maiello, Marcel Schepko, Lina K. Klein, Vivian C. Paulun, Roland W. Fleming: Department of Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
affiliation Roland W. Fleming: Center for Mind, Brain and Behavior, Justus Liebig University Giessen, Giessen, Germany
collection DOAJ
language English
format Article
sources DOAJ
author Guido Maiello
Marcel Schepko
Lina K. Klein
Vivian C. Paulun
Roland W. Fleming
author_sort Guido Maiello
title Humans Can Visually Judge Grasp Quality and Refine Their Judgments Through Visual and Haptic Feedback
publisher Frontiers Media S.A.
series Frontiers in Neuroscience
issn 1662-453X
publishDate 2021-01-01
description How humans visually select where to grasp objects is determined by the physical object properties (e.g., size, shape, weight), the degrees of freedom of the arm and hand, as well as the task to be performed. We recently demonstrated that human grasps are near-optimal with respect to a weighted combination of different cost functions that make grasps uncomfortable, unstable, or impossible, e.g., due to unnatural grasp apertures or large torques. Here, we ask whether humans can consciously access these rules. We test whether humans can explicitly judge grasp quality derived from rules regarding grasp size, orientation, torque, and visibility. More specifically, we test whether grasp quality can be inferred (i) by using visual cues and motor imagery alone, (ii) from watching grasps executed by others, and (iii) through performing grasps, i.e., receiving visual, proprioceptive, and haptic feedback. Stimuli were novel objects made of 10 cubes of brass and wood (side length 2.5 cm) in various configurations. On each object, one near-optimal and one sub-optimal grasp were selected based on one cost function (e.g., torque), while the other constraints (grasp size, orientation, and visibility) were kept approximately constant or counterbalanced. Participants were visually cued to the location of the selected grasps on each object and verbally reported which of the two grasps was best. Across three experiments, participants were required to either (i) passively view the static objects and imagine executing the two competing grasps, (ii) passively view videos of other participants grasping the objects, or (iii) actively grasp the objects themselves. Our results show that, for a majority of tested objects, participants could already judge grasp optimality from simply viewing the objects and imagining grasping them, but were significantly better in the video and grasping sessions. These findings suggest that humans can determine grasp quality even without performing the grasp, perhaps through motor imagery, and can further refine their understanding of how to correctly grasp an object through sensorimotor feedback, but also by passively viewing others grasp objects.
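note The description above frames grasp quality as a weighted combination of cost terms (grasp size, orientation, torque, visibility). As a rough, non-authoritative illustration only, such a score could be combined as in the short Python sketch below; the cost values, equal default weights, and function name are assumptions for the sketch, not the model fitted in the article.

    # Minimal sketch (assumed names and weights): combine per-grasp cost terms
    # into a single score; a lower total cost indicates a better grasp.
    def grasp_cost(costs, weights=None):
        """Return the weighted sum of normalized cost terms for one grasp."""
        if weights is None:
            # Equal weights by default; the article instead fits weights to human grasp data.
            weights = {name: 1.0 for name in costs}
        return sum(weights[name] * value for name, value in costs.items())

    # Hypothetical example: a near-optimal vs. a sub-optimal grasp on one object,
    # differing mainly in torque while the other terms are held roughly constant.
    near_optimal = {"size": 0.2, "orientation": 0.3, "torque": 0.1, "visibility": 0.2}
    sub_optimal = {"size": 0.2, "orientation": 0.3, "torque": 0.9, "visibility": 0.2}
    print(grasp_cost(near_optimal) < grasp_cost(sub_optimal))  # True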
topic grasping
visual grasp selection
precision grip
shape
material
motor imagery
url https://www.frontiersin.org/articles/10.3389/fnins.2020.591898/full