Eye-gaze information input based on pupillary response to visual stimulus with luminance modulation.

Bibliographic Details
Main Authors: Yumiko Muto, Hideka Miyoshi, Hirohiko Kaneko
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2020-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0226991
Description
Summary: This study develops an information-input interface in which a visual stimulus targeted by a user's eye gaze is identified based on the pupillary light reflex to periodic luminance modulations of the object. Experiment 1 examines how pupil size changes in response to periodic luminance modulation of visual stimuli, and the results are used to develop an algorithm for information input. Experiment 2a examines the effectiveness of interfaces with two objects. The results demonstrate that 98% accurate identification of the gaze-targeted object is possible if the luminance modulation frequencies of the two objects differ by at least 0.12 Hz. Experiment 2b examines the accuracy of a gaze-directed information-input method based on a keyboard configuration with twelve responses. The results reveal that keyboard input is possible with an average accuracy of 85% for luminance modulation frequencies from 0.75 to 2.75 Hz. The proposed pupillometry-based information-input interface offers several advantages, such as a low burden on users, minimal invasiveness, no need for training or experience, high theoretical validity, and no need for calibration. Thus, the pupillometry method presented herein is suited to practical use without requiring the eye's position to be calibrated. Additionally, this method has potential for the design of interfaces that allow patients with severely limited motor function to communicate with others.
ISSN:1932-6203
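
The frequency-tagging idea summarized above — each object's luminance is modulated at a distinct frequency, and the gazed object is taken to be the one whose frequency dominates the recorded pupil-size signal — can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the sampling rate, modulation amplitudes, noise level, and the simulated antiphase pupillary reflex are all assumed parameters, and target identification is done with a simple single-bin DFT (lock-in style) power estimate.

```python
import math
import random

def frequency_power(signal, freq, fs):
    """Power of `signal` at `freq` (Hz) via a single-bin DFT."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (re * re + im * im) / n

def identify_target(pupil_trace, candidate_freqs, fs):
    """Return the candidate modulation frequency whose component
    dominates the pupil-size trace (the presumed gaze target)."""
    return max(candidate_freqs, key=lambda f: frequency_power(pupil_trace, f, fs))

# Simulated example (hypothetical values): the pupil oscillates in
# antiphase with a 1.0 Hz luminance modulation, as the light reflex
# constricts the pupil when luminance rises.
fs = 60.0                      # eye-tracker sampling rate (Hz), assumed
f_target, f_other = 1.0, 1.5   # two objects, > 0.12 Hz apart as in Exp. 2a
random.seed(0)
trace = [0.3 * math.sin(2 * math.pi * f_target * (i / fs) + math.pi)  # reflex
         + 0.05 * random.gauss(0, 1)                                  # noise
         for i in range(int(10 * fs))]                                # 10 s trace

print(identify_target(trace, [f_target, f_other], fs))  # → 1.0
```

In practice the pupillary response lags the stimulus and is low-pass in nature, so a real implementation would also need artifact handling (blinks) and a response model; the single-bin power comparison above only conveys the selection principle.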