Visual Saliency Prediction and Evaluation across Different Perceptual Tasks.
Saliency maps produced by different algorithms are often evaluated by comparing output to fixated image locations appearing in human eye tracking data. There are challenges in evaluation based on fixation data due to bias in the data. Properties of eye movement patterns that are independent of image...
Main Authors: | Shafin Rahman, Neil Bruce |
---|---|
Format: | Article |
Language: | English |
Published: | Public Library of Science (PLoS), 2015-01-01 |
Series: | PLoS ONE |
Online Access: | http://europepmc.org/articles/PMC4569362?pdf=render |
id |
doaj-9e9c43e2df4b48d1b0182295fb370615 |
---|---|
record_format |
Article |
doi |
10.1371/journal.pone.0138053 |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Shafin Rahman, Neil Bruce |
author_sort |
Shafin Rahman |
title |
Visual Saliency Prediction and Evaluation across Different Perceptual Tasks. |
publisher |
Public Library of Science (PLoS) |
series |
PLoS ONE |
issn |
1932-6203 |
publishDate |
2015-01-01 |
description |
Saliency maps produced by different algorithms are often evaluated by comparing their output to fixated image locations in human eye tracking data. Such evaluation is challenging, however, because fixation data are biased: properties of eye movement patterns that are independent of image content, notably spatial bias, can limit the validity of evaluation results. To address this problem, we present modeling and evaluation results for data derived from different perceptual tasks related to the concept of saliency. We also present a novel approach to benchmarking that deals with some of the challenges posed by spatial bias. The results establish the value of alternatives to fixation data for driving the improvement and development of models. We also demonstrate an approach to approximating the output of alternative perceptual tasks based on computational saliency and/or eye gaze data. As a whole, this work presents novel benchmarking results and methods, establishes a new performance baseline for perceptual tasks that provide an alternative window into visual saliency, and demonstrates the capacity for saliency to serve in approximating human behaviour on one visual task given data from another. |
url |
http://europepmc.org/articles/PMC4569362?pdf=render |
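The abstract above describes scoring saliency maps against fixated image locations and the spatial (centre) bias that complicates such scoring. As a rough illustration of that style of benchmarking, and not the article's own method, the following minimal Python sketch computes a generic fixation-based AUC and a shuffled-AUC variant whose negatives are fixations drawn from other images; all function names and parameters here are illustrative assumptions.

```python
# Illustrative sketch only: generic fixation-based AUC and shuffled AUC,
# not the benchmark proposed in the article. Requires numpy.
import numpy as np


def auc_fixations(saliency, fixations):
    """AUC: how well saliency values separate fixated pixels from all pixels.

    saliency  -- 2D array (H, W) of saliency values
    fixations -- sequence of (row, col) fixated pixel coordinates
    """
    pos = np.array([saliency[r, c] for r, c in fixations], dtype=float)
    neg = saliency.ravel().astype(float)  # "negatives" = every pixel
    return _roc_auc(pos, neg)


def shuffled_auc(saliency, fixations, other_fixations):
    """Shuffled AUC: negatives are fixations taken from *other* images, so a
    map that only encodes spatial (centre) bias no longer scores well."""
    pos = np.array([saliency[r, c] for r, c in fixations], dtype=float)
    neg = np.array([saliency[r, c] for r, c in other_fixations], dtype=float)
    return _roc_auc(pos, neg)


def _roc_auc(pos, neg):
    """Area under the ROC curve from positive and negative score samples."""
    thresholds = np.sort(np.unique(pos))[::-1]        # sweep high -> low
    tpr = [(pos >= t).mean() for t in thresholds]
    fpr = [(neg >= t).mean() for t in thresholds]
    tpr = np.concatenate(([0.0], tpr, [1.0]))
    fpr = np.concatenate(([0.0], fpr, [1.0]))
    # Trapezoidal integration of the ROC curve.
    return float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sal = rng.random((240, 320))                      # dummy saliency map
    fix = [(rng.integers(240), rng.integers(320)) for _ in range(50)]
    other = [(rng.integers(240), rng.integers(320)) for _ in range(200)]
    print(auc_fixations(sal, fix), shuffled_auc(sal, fix, other))
```

In the shuffled variant, a map that simply predicts the image centre no longer scores well, because the negative fixation set carries the same spatial bias as the positives; this is one common way the bias discussed in the abstract is discounted in practice.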