MACH: My Automated Conversation coacH

MACH--My Automated Conversation coacH--is a novel system that provides ubiquitous access to social skills training. The system includes a virtual agent that reads facial expressions, speech, and prosody and responds with verbal and nonverbal behaviors in real time. This paper presents an application of MACH in the context of training for job interviews. During the training, MACH asks interview questions, automatically mimics certain behaviors of the user, and exhibits appropriate nonverbal behaviors. Following the interaction, MACH provides visual feedback on the user's performance. The development of this application draws on data from 28 interview sessions involving employment-seeking students and career counselors. The effectiveness of MACH was assessed through a weeklong trial with 90 MIT undergraduates. Students who interacted with MACH were rated by human experts to have improved in overall interview performance, while the ratings of students in control groups did not improve. Post-experiment interviews indicate that participants found the interview experience informative about their behaviors and expressed interest in using MACH in the future.


Bibliographic Details
Main Authors: Courgeon, Matthieu (Author), Martin, Jean-Claude (Author), Mutlu, Bilge (Author), Picard, Rosalind W. (Contributor), Hoque, Mohammed Ehasanul (Contributor)
Other Authors: Massachusetts Institute of Technology. Media Laboratory (Contributor), Program in Media Arts and Sciences (Massachusetts Institute of Technology) (Contributor)
Format: Article
Language: English
Published: Association for Computing Machinery (ACM), 2014-12-22T18:43:58Z.
Subjects:
Online Access: Get fulltext
LEADER 02243 am a22002653u 4500
001 92442
042 |a dc 
100 1 0 |a Courgeon, Matthieu  |e author 
100 1 0 |a Massachusetts Institute of Technology. Media Laboratory  |e contributor 
100 1 0 |a Program in Media Arts and Sciences   |q  (Massachusetts Institute of Technology)   |e contributor 
100 1 0 |a Hoque, Mohammed Ehasanul  |e contributor 
100 1 0 |a Picard, Rosalind W.  |e contributor 
700 1 0 |a Martin, Jean-Claude  |e author 
700 1 0 |a Mutlu, Bilge  |e author 
700 1 0 |a Picard, Rosalind W.  |e author 
700 1 0 |a Hoque, Mohammed Ehasanul  |e author 
245 0 0 |a MACH: My Automated Conversation coacH 
260 |b Association for Computing Machinery (ACM),   |c 2014-12-22T18:43:58Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/92442 
520 |a MACH--My Automated Conversation coacH--is a novel system that provides ubiquitous access to social skills training. The system includes a virtual agent that reads facial expressions, speech, and prosody and responds with verbal and nonverbal behaviors in real time. This paper presents an application of MACH in the context of training for job interviews. During the training, MACH asks interview questions, automatically mimics certain behaviors of the user, and exhibits appropriate nonverbal behaviors. Following the interaction, MACH provides visual feedback on the user's performance. The development of this application draws on data from 28 interview sessions involving employment-seeking students and career counselors. The effectiveness of MACH was assessed through a weeklong trial with 90 MIT undergraduates. Students who interacted with MACH were rated by human experts to have improved in overall interview performance, while the ratings of students in control groups did not improve. Post-experiment interviews indicate that participants found the interview experience informative about their behaviors and expressed interest in using MACH in the future. 
520 |a Samsung (Firm) 
520 |a MIT Media Lab Consortium 
546 |a en_US 
655 7 |a Article 
773 |t Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing (UbiComp '13)