Coordinating the eyes and hand in goal-directed movement sequences


Bibliographic Details
Main Author: Bowman, Miles
Other Authors: Queen's University (Kingston, Ont.). Theses (Queen's University (Kingston, Ont.))
Format: Others
Language: en
Published: 2009
Subjects: eye hand coordination; motor control
Online Access:http://hdl.handle.net/1974/5317
id ndltd-LACETR-oai-collectionscanada.gc.ca-OKQ.1974-5317
record_format oai_dc
Rights: This publication is made available by the authority of the copyright owner solely for the purpose of private study and research and may not be copied or reproduced except as permitted by the copyright laws without written authority from the copyright owner.
collection NDLTD
language en
format Others
sources NDLTD
topic eye hand coordination
motor control
description Coordinated gaze and hand movements dominate many of our interactions in reachable space, and yet few studies examine the potential contribution of tactile feedback to planning these actions. This thesis was designed to investigate eye and hand coordination during movement sequences when reaching out to interact with objects. We developed a virtual reality paradigm that allowed us to control the visual, tactile, and in some cases auditory feedback provided to participants. Participants reached and touched five objects in succession. We measured the behaviour that resulted from removing one or more of the aforementioned sources of feedback, focusing on task accuracy and on the timing and dynamics of eye and hand movements. Our principal manipulations were to remove visual feedback of the hand and/or to change the object's response to contact. We also unexpectedly removed the tactile feedback signaling contact. In Experiment 1, we examined gaze and hand movement timing relative to contact events. Gaze remained long enough to capture contact in central vision, but also followed a time course indicating that contact timing was predicted. In Experiment 2, we examined the influence of dynamic object consequences (i.e., motion). Gaze remained to monitor consequences that followed initial contact, especially when the hand was invisible; with longer delays it became difficult to differentiate between predictive and reactive movements. In Experiment 3, we directly tested whether gaze would hold upon a site of action during prolonged manipulation. Here, gaze remained past the time of contact, and its departure was instead associated with the completion of the action. Our findings are congruent with the notion that visually guided reaches are controlled to facilitate directing the hand to viewed locations of action: without visual feedback of the hand, accuracy diminished and the hand's approach changed across all experiments.
However, we provide consistent evidence that gaze is also controlled to capture planned sensory consequences related to action at its viewed location. Monitoring these sites would facilitate comparing predicted sensory events with those that are actively measured, and would improve control throughout the movement sequence. Such a process also indicates the importance of considering tactile feedback when examining coordinated eye and hand movements. === Thesis (Ph.D, Neuroscience Studies) -- Queen's University, 2009-11-13