Control of Articulated Robot Arm by Eye Tracking

Bibliographic Details
Main Authors: Shahzad, Muhammad Imran, Mehmood, Saqib
Format: Others
Language: English
Published: Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation 2010
Subjects: Eyetracking, Interface, Articulated robot, Computer Sciences, Datavetenskap (datalogi)
Online Access:http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3096
id ndltd-UPSALLA1-oai-DiVA.org-bth-3096
record_format oai_dc
spelling ndltd-UPSALLA1-oai-DiVA.org-bth-3096 (record timestamp 2018-01-12T05:14:15Z)
Title: Control of Articulated Robot Arm by Eye Tracking (eng)
Authors: Shahzad, Muhammad Imran; Mehmood, Saqib
Publisher: Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2010
Subjects: Eyetracking; Interface; Articulated robot; Computer Sciences; Datavetenskap (datalogi)
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Online access: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3096
Local id: oai:bth.se:arkivex903504DAE6D4315CC125781C00552554
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess
collection NDLTD
language English
format Others
sources NDLTD
topic Eyetracking
Interface
Articulated robot
Computer Sciences
Datavetenskap (datalogi)
description Eye tracking has seen many comprehensive achievements in the field of human-computer interaction. Using the human eyes as an alternative to the hands is an innovative approach from a human-computer interaction perspective. Many applications for autonomous robot control have already been developed, but we developed two different interfaces to control an articulated robot manually. The first of these interfaces is controlled by mouse and the second by eye tracking. The main focus of our thesis is to assist people with motor disabilities by using their eyes as input instead of a mouse. An eye-gaze tracking technique is used to send commands that perform different tasks. The interfaces are divided into active and inactive regions. Dwell time is a well-known technique used to execute commands through eye gaze instead of a mouse click. When a user gazes at an active region for a specific dwell time, the corresponding command is executed and the robot performs a specific task. When inactive regions are gazed at, no command is executed and no function is performed. The difference between the time needed to perform a task with the mouse and with eye tracking is shown to be 40 ms, the mouse being faster. However, a mouse cannot be used by people with motor disabilities, so the eye tracker has a decisive advantage in this case. Keywords: Eyetracking, Interface, Articulated robot
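The dwell-time mechanism summarised in the description lends itself to a short sketch. The Python below is a minimal illustration only: the region names, coordinates, dwell threshold, and gaze-sample format (timestamp_s, x, y) are assumptions made for this example, not details of the interface implemented in the thesis.

# Illustrative sketch only (not from the thesis): dwell-time selection over
# rectangular active regions. Region names, coordinates, and the gaze-sample
# format (timestamp_s, x, y) are assumptions made for this example.
from dataclasses import dataclass

DWELL_TIME_S = 1.0  # assumed dwell threshold; not a value taken from the thesis

@dataclass
class Region:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Gazing inside an active region long enough triggers a robot command;
# gaze anywhere outside them (the inactive regions) triggers nothing.
ACTIVE_REGIONS = [
    Region("rotate_base_left", 0, 0, 100, 100),
    Region("rotate_base_right", 540, 0, 640, 100),
]

def select_by_dwell(gaze_samples):
    """Yield a region name each time gaze stays inside it for DWELL_TIME_S."""
    current = None     # region currently being fixated (or None)
    entered_at = None  # time at which the current fixation began
    for t, x, y in gaze_samples:
        hit = next((r for r in ACTIVE_REGIONS if r.contains(x, y)), None)
        if hit is not current:
            current, entered_at = hit, t   # gaze moved to a different region
        elif hit is not None and t - entered_at >= DWELL_TIME_S:
            yield hit.name                 # dwell threshold reached: issue command
            entered_at = t                 # reset so the command is not fired on every sample

For example, feeding it samples such as (0.0, 50, 50), (0.5, 52, 48), (1.1, 51, 50) would yield "rotate_base_left" once, since the gaze stayed inside that region for at least the assumed one-second dwell time.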
author Shahzad, Muhammad Imran
Mehmood, Saqib
title Control of Articulated Robot Arm by Eye Tracking
publisher Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation
publishDate 2010
url http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3096