A Novel Eye Movement Data Transformation Technique that Preserves Temporal Information: A Demonstration in a Face Processing Task

Existing research has shown that human eye-movement data conveys rich information about underlying mental processes, and that the latter may be inferred from the former. However, most related studies rely on spatial information about which different areas of visual stimuli were looked at, without considering the order in which this occurred. Although powerful algorithms for making pairwise comparisons between eye-movement sequences (scanpaths) exist, the problem is how to compare two groups of scanpaths, e.g., those registered with vs. without an experimental manipulation in place, rather than individual scanpaths. Here, we propose that the problem might be solved by projecting a scanpath similarity matrix, obtained via a pairwise comparison algorithm, to a lower-dimensional space (the comparison and dimensionality-reduction techniques we use are ScanMatch and t-SNE). The resulting distributions of low-dimensional vectors representing individual scanpaths can be statistically compared. To assess if the differences result from temporal scanpath features, we propose to statistically compare the cross-validated accuracies of two classifiers predicting group membership: (1) based exclusively on spatial metrics; (2) based additionally on the obtained scanpath representation vectors. To illustrate, we compare autistic vs. typically-developing individuals looking at human faces during a lab experiment and find significant differences in temporal scanpath features.
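
To make the pipeline concrete, the following minimal Python sketch (not the authors' code) assumes a precomputed pairwise scanpath similarity matrix; in the paper this comes from ScanMatch, a MATLAB toolbox, so a simulated matrix S stands in here. The sketch projects the matrix to two dimensions with scikit-learn's t-SNE and compares the two groups' embeddings with a simple permutation test on the distance between group centroids. The variable names and the choice of test are illustrative assumptions rather than the procedure used in the study.

```python
# Illustrative sketch (not the authors' code): embed a pairwise scanpath
# similarity matrix into 2-D with t-SNE and compare the two groups'
# low-dimensional distributions with a simple permutation test on centroids.
# The matrix "S" and the labels "groups" are hypothetical placeholders; with
# real data, S would hold ScanMatch-style pairwise similarity scores.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Hypothetical data: 40 scanpaths, similarity scores in [0, 1].
n = 40
S = rng.uniform(0.0, 1.0, size=(n, n))
S = (S + S.T) / 2.0          # enforce symmetry
np.fill_diagonal(S, 1.0)     # each scanpath is maximally similar to itself
groups = np.array([0] * 20 + [1] * 20)  # e.g., with vs. without manipulation

# Convert similarity to dissimilarity and embed with t-SNE.
D = 1.0 - S
embedding = TSNE(n_components=2, metric="precomputed", init="random",
                 perplexity=10, random_state=0).fit_transform(D)

# Permutation test on the distance between the two group centroids.
def centroid_gap(points, labels):
    return np.linalg.norm(points[labels == 0].mean(axis=0)
                          - points[labels == 1].mean(axis=0))

observed = centroid_gap(embedding, groups)
perms = [centroid_gap(embedding, rng.permutation(groups)) for _ in range(5000)]
p_value = (np.sum(np.array(perms) >= observed) + 1) / (len(perms) + 1)
print(f"centroid gap = {observed:.3f}, permutation p = {p_value:.3f}")
```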


Bibliographic Details
Main Authors: Michał Król, Magdalena Ewa Król
Format: Article
Language: English
Published: MDPI AG, 2019-05-01
Series: Sensors
Subjects: eye tracking; scanpath comparison; dimensionality reduction; machine learning; autism; face perception
Online Access: https://www.mdpi.com/1424-8220/19/10/2377
id doaj-00b9fd7289264b6ca61dfdf49a919159
record_format Article
spelling doaj-00b9fd7289264b6ca61dfdf49a919159 | 2020-11-25T01:38:41Z | eng | MDPI AG | Sensors | ISSN 1424-8220 | 2019-05-01 | vol. 19, no. 10, art. 2377 | doi:10.3390/s19102377 | s19102377 | A Novel Eye Movement Data Transformation Technique that Preserves Temporal Information: A Demonstration in a Face Processing Task | Michał Król (Department of Economics, The University of Manchester, Manchester M13 9PL, UK); Magdalena Ewa Król (Wroclaw Faculty of Psychology, SWPS University of Social Sciences and Humanities, 53-238 Wroclaw, Poland) | abstract and keywords as in the description and topic fields below | https://www.mdpi.com/1424-8220/19/10/2377
collection DOAJ
language English
format Article
sources DOAJ
author Michał Król
Magdalena Ewa Król
author_sort Michał Król
title A Novel Eye Movement Data Transformation Technique that Preserves Temporal Information: A Demonstration in a Face Processing Task
title_sort novel eye movement data transformation technique that preserves temporal information: a demonstration in a face processing task
publisher MDPI AG
series Sensors
issn 1424-8220
publishDate 2019-05-01
description Existing research has shown that human eye-movement data conveys rich information about underlying mental processes, and that the latter may be inferred from the former. However, most related studies rely on spatial information about which different areas of visual stimuli were looked at, without considering the order in which this occurred. Although powerful algorithms for making pairwise comparisons between eye-movement sequences (scanpaths) exist, the problem is how to compare two groups of scanpaths, e.g., those registered with vs. without an experimental manipulation in place, rather than individual scanpaths. Here, we propose that the problem might be solved by projecting a scanpath similarity matrix, obtained via a pairwise comparison algorithm, to a lower-dimensional space (the comparison and dimensionality-reduction techniques we use are ScanMatch and t-SNE). The resulting distributions of low-dimensional vectors representing individual scanpaths can be statistically compared. To assess if the differences result from temporal scanpath features, we propose to statistically compare the cross-validated accuracies of two classifiers predicting group membership: (1) based exclusively on spatial metrics; (2) based additionally on the obtained scanpath representation vectors. To illustrate, we compare autistic vs. typically-developing individuals looking at human faces during a lab experiment and find significant differences in temporal scanpath features.
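
The classifier-comparison step can be sketched in the same spirit. The snippet below (again illustrative, not the authors' code) estimates cross-validated accuracy for a classifier trained on spatial gaze metrics alone and for one that also receives the scanpath representation vectors, then compares the fold-wise accuracies. The feature arrays, the logistic-regression model, and the paired t-test are placeholder assumptions.

```python
# Illustrative sketch (not the authors' code): does adding the low-dimensional
# scanpath representation vectors improve prediction of group membership over
# spatial gaze metrics alone?
import numpy as np
from scipy.stats import ttest_rel
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Hypothetical feature sets standing in for the real data.
n = 60
spatial_features = rng.normal(size=(n, 5))   # e.g., fixation counts/durations per AOI
scanpath_vectors = rng.normal(size=(n, 2))   # e.g., t-SNE coordinates per scanpath
labels = np.repeat([0, 1], n // 2)           # group membership (e.g., ASD vs. TD)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

acc_spatial = cross_val_score(clf, spatial_features, labels, cv=cv)
acc_combined = cross_val_score(
    clf, np.hstack([spatial_features, scanpath_vectors]), labels, cv=cv)

# Paired comparison of fold-wise accuracies; a crude stand-in for the
# statistical comparison of cross-validated accuracies described above.
t_stat, p_value = ttest_rel(acc_combined, acc_spatial)
print(f"spatial only: {acc_spatial.mean():.3f}, "
      f"spatial + scanpath vectors: {acc_combined.mean():.3f}, p = {p_value:.3f}")
```
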
topic eye tracking
scanpath comparison
dimensionality reduction
machine learning
autism
face perception
url https://www.mdpi.com/1424-8220/19/10/2377
work_keys_str_mv AT michałkrol anoveleyemovementdatatransformationtechniquethatpreservestemporalinformationademonstrationinafaceprocessingtask
AT magdalenaewakrol anoveleyemovementdatatransformationtechniquethatpreservestemporalinformationademonstrationinafaceprocessingtask
AT michałkrol noveleyemovementdatatransformationtechniquethatpreservestemporalinformationademonstrationinafaceprocessingtask
AT magdalenaewakrol noveleyemovementdatatransformationtechniquethatpreservestemporalinformationademonstrationinafaceprocessingtask
_version_ 1725052142239088640