Perception of Emotion from Facial Expression and Affective Prosody


Bibliographic Details
Main Author: Santorelli, Noelle Turini
Format: Others
Published: Digital Archive @ GSU 2006
Subjects:
Online Access: http://digitalarchive.gsu.edu/psych_theses/17
http://digitalarchive.gsu.edu/cgi/viewcontent.cgi?article=1016&context=psych_theses
Description
Summary: Real-world perception of emotion results from the integration of multiple cues, most notably facial expression and affective prosody. The use of incongruent emotional stimuli presents an opportunity to study the interaction between sensory modalities. Thirty-seven participants were exposed to audio-visual stimuli (Robins & Schultz, 2004) including angry, fearful, happy, and neutral presentations. Eighty stimuli contained matching emotions and 240 contained incongruent emotional cues. Matching emotions elicited a significant number of correct responses for all four emotions. Sign tests indicated that for most incongruent conditions, participants demonstrated a bias towards the visual modality. Despite these findings, specific incongruent conditions did show evidence of blending. Future research should explore an evolutionary model of facial expression as a means for behavioral adaptation and the possibility of an “emotional McGurk effect” in particular combinations of emotions.