Towards an Engagement-Aware Attentive Artificial Listener for Multi-Party Interactions

Listening to one another is essential to human-human interaction. In fact, we humans spend a substantial part of our day listening to other people, in private as well as in work settings. Attentive listening serves to gather information for oneself, but at the same time it also signals to the speaker that they are being heard. To deduce whether our interlocutors are listening to us, we rely on reading their non-verbal cues, much as we use non-verbal cues to signal our own attention. Such signaling becomes more complex when we move from dyadic to multi-party interactions. Understanding how humans use non-verbal cues in a multi-party listening context not only increases our understanding of human-human communication but also aids the development of successful human-robot interactions.

This paper brings together previous analyses of listener behavior in human-human multi-party interaction and provides novel insights into the gaze patterns between listeners in particular. We investigate whether the gaze patterns and feedback behavior observed in human-human dialogue are also beneficial for the perception of a robot in multi-party human-robot interaction. To answer this question, we implemented an attentive listening system that generates multi-modal listening behavior based on our human-human analysis. We compare our system to a baseline system that does not differentiate between listener types in its behavior generation, and evaluate it in terms of the participants' perception of the robot, their behavior, and the perception of third-party observers.

Bibliographic Details
Main Authors: Catharine Oertel, Patrik Jonell, Dimosthenis Kontogiorgos, Kenneth Funes Mora, Jean-Marc Odobez, Joakim Gustafson
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-07-01
Series: Frontiers in Robotics and AI
ISSN: 2296-9144
DOI: 10.3389/frobt.2021.555913
Subjects: multi-party interactions; non-verbal behaviors; eye-gaze patterns; head gestures; human-robot interaction; artificial listener
Online Access: https://www.frontiersin.org/articles/10.3389/frobt.2021.555913/full
Author Affiliations:
Catharine Oertel: Department of Intelligent Systems, Interactive Intelligence, Delft University of Technology, Delft, Netherlands
Patrik Jonell: Division of Speech, Music and Hearing, KTH Royal Institute of Technology, Stockholm, Sweden
Dimosthenis Kontogiorgos: Division of Speech, Music and Hearing, KTH Royal Institute of Technology, Stockholm, Sweden
Kenneth Funes Mora: Eyeware Tech SA, Martigny, Switzerland
Jean-Marc Odobez: Perception and Activity Understanding, Idiap Research Institute, Martigny, Switzerland
Joakim Gustafson: Division of Speech, Music and Hearing, KTH Royal Institute of Technology, Stockholm, Sweden