Human and robot arm control using the minimum variance principle


Bibliographic Details
Main Author: Simmons, Gavin Iain
Published: Imperial College London, 2008
Subjects: 612
Online Access: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.486278
Description
Summary: Many computational models of human upper limb movement successfully capture some features of human movement, but often lack a compelling biological basis. One that provides such a basis is Harris and Wolpert's minimum variance model. In this model, the variance of the hand at the end of a movement is minimised, given that the controlling signal is subject to random noise with zero mean and variance proportional to the signal's amplitude. This criterion offers a consistent explanation for several movement characteristics. This work formulates the minimum variance model into a form suitable for controlling a robot arm. This implementation allows examination of the model's properties, specifically its applicability to producing human-like movement. The model is subsequently tested in areas important to studies of human movement and robotics, including reaching, grasping, and action perception. For reaching, experiments show this formulation successfully captures the characteristics of movement, supporting previous results. Reaching is initially performed between two points, but complex trajectories are also investigated through the inclusion of via-points. The addition of a gripper extends the model, allowing production of trajectories for grasping an object. Using the minimum variance principle to derive digit trajectories, a quantitative explanation for the approach of the digits to the object surface is provided. These trajectories also exhibit human-like spatial and temporal coordination between hand transport and grip aperture. The model's predictive ability is further tested in the perception of human-demonstrated actions. Through integration with a system that performs perception using its motor system offline, in line with the motor theory of perception, the model is shown to correlate well with data on human perception of movement. These experiments investigate and extend the explanatory and predictive use of the model for human movement, and demonstrate that it can be suitably formulated to produce human-like movement on robot arms.
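
To give a flavour of the principle the abstract describes, the sketch below solves a minimum-variance reaching problem for a one-dimensional point mass whose control signal carries zero-mean noise with variance proportional to the squared command. This is not the robot-arm formulation developed in the thesis: the point-mass dynamics, time step, movement duration, target distance, and the choice of minimising the combined endpoint position and velocity variance (rather than positional variance over a post-movement period) are all illustrative assumptions.

import numpy as np

# Minimal sketch (not the thesis implementation): minimum-variance reaching
# for a 1-D point mass, assuming the control signal is corrupted by zero-mean
# noise whose variance is proportional to u_t^2 (signal-dependent noise).
dt, N, target = 0.01, 50, 0.3          # time step [s], number of steps, reach distance [m] (illustrative)
A = np.array([[1.0, dt], [0.0, 1.0]])  # state transition for [position, velocity]
B = np.array([[0.0], [dt]])            # control (force per unit mass) enters through velocity

# Influence of the control at step t on the final state: Phi_t = A^(N-1-t) B.
Phi = np.zeros((N, 2))
for t in range(N):
    Phi[t] = (np.linalg.matrix_power(A, N - 1 - t) @ B).ravel()
c, d = Phi[:, 0], Phi[:, 1]            # effect on final position and final velocity

# With noise variance proportional to u_t^2, the endpoint variance (position
# plus velocity, equally weighted here for simplicity) is proportional to
# sum_t (c_t^2 + d_t^2) u_t^2, so we minimise u^T W u subject to the mean
# trajectory reaching the target at rest: C u = [target, 0].
W = np.diag(c**2 + d**2)
C = np.vstack([c, d])
b = np.array([target, 0.0])

Winv = np.linalg.inv(W)
u = Winv @ C.T @ np.linalg.solve(C @ Winv @ C.T, b)   # closed-form equality-constrained QP solution

# Roll out the mean trajectory; qualitatively it accelerates and then
# decelerates, in line with the bell-shaped velocity profiles the minimum
# variance model is known to predict for point-to-point reaches.
x = np.zeros(2)
for t in range(N):
    x = A @ x + (B * u[t]).ravel()
print("final position:", x[0], "final velocity:", x[1])

The closed-form solution above is just the Lagrange-multiplier solution of a quadratic programme with linear equality constraints; the thesis's formulation for a multi-joint robot arm with via-points and grasping is considerably richer.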