A Common Neural Code for Perceived and Inferred Emotion

Although the emotions of other people can often be perceived from overt reactions (e.g., facial or vocal expressions), they can also be inferred from situational information in the absence of observable expressions. How does the human brain make use of these diverse forms of evidence to generate a common representation of a target's emotional state?

Bibliographic Details
Main Authors: Skerry, Amy E. (Contributor), Saxe, Rebecca R. (Contributor)
Other Authors: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences (Contributor)
Format: Article
Language: English
Published: Society for Neuroscience, 2015-06-09T15:50:57Z.
Online Access: Get fulltext
LEADER 02530 am a22002053u 4500
001 97243
042 |a dc 
100 1 0 |a Skerry, Amy E.  |e author 
100 1 0 |a Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences  |e contributor 
100 1 0 |a Skerry, Amy E.  |e contributor 
100 1 0 |a Saxe, Rebecca R.  |e contributor 
700 1 0 |a Saxe, Rebecca R.  |e author 
245 0 0 |a A Common Neural Code for Perceived and Inferred Emotion 
260 |b Society for Neuroscience,   |c 2015-06-09T15:50:57Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/97243 
520 |a Although the emotions of other people can often be perceived from overt reactions (e.g., facial or vocal expressions), they can also be inferred from situational information in the absence of observable expressions. How does the human brain make use of these diverse forms of evidence to generate a common representation of a target's emotional state? In the present research, we identify neural patterns that correspond to emotions inferred from contextual information and find that these patterns generalize across different cues from which an emotion can be attributed. Specifically, we use functional neuroimaging to measure neural responses to dynamic facial expressions with positive and negative valence and to short animations in which the valence of a character's emotion could be identified only from the situation. Using multivoxel pattern analysis, we test for regions that contain information about the target's emotional state, identifying representations specific to a single stimulus type and representations that generalize across stimulus types. In regions of medial prefrontal cortex (MPFC), a classifier trained to discriminate emotional valence for one stimulus (e.g., animated situations) could successfully discriminate valence for the remaining stimulus (e.g., facial expressions), indicating a representation of valence that abstracts away from perceptual features and generalizes across different forms of evidence. Moreover, in a subregion of MPFC, this neural representation generalized to trials involving subjectively experienced emotional events, suggesting partial overlap in neural responses to attributed and experienced emotions. These data provide a step toward understanding how the brain transforms stimulus-bound inputs into abstract representations of emotion. 
520 |a National Institutes of Health (U.S.) (Grant 1R01 MH096914-01A1) 
546 |a en_US 
655 7 |a Article 
773 |t Journal of Neuroscience
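
The central analysis summarized in the 520 abstract (training a classifier to discriminate emotional valence from neural patterns evoked by one stimulus type, then testing it on patterns evoked by the other) can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the authors' pipeline: the linear SVM, the array names (X_faces, X_situations), and the random stand-in data are all hypothetical; real inputs would be per-trial response patterns extracted from a region of interest such as MPFC.

```python
# Minimal sketch of cross-stimulus MVPA decoding (scikit-learn, NumPy).
# Hypothetical inputs: X_* are (n_trials x n_voxels) response patterns from
# one region of interest; y_* are valence labels (0 = negative, 1 = positive).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200

# Random stand-in data; in a real analysis these would be per-trial
# GLM estimates for facial-expression and animated-situation trials.
X_faces = rng.normal(size=(n_trials, n_voxels))
y_faces = rng.integers(0, 2, size=n_trials)
X_situations = rng.normal(size=(n_trials, n_voxels))
y_situations = rng.integers(0, 2, size=n_trials)

clf = make_pipeline(StandardScaler(), LinearSVC())

# Train on one stimulus type, test on the other, in both directions.
clf.fit(X_situations, y_situations)
acc_sit_to_face = clf.score(X_faces, y_faces)

clf.fit(X_faces, y_faces)
acc_face_to_sit = clf.score(X_situations, y_situations)

print(f"situations -> faces accuracy: {acc_sit_to_face:.2f}")
print(f"faces -> situations accuracy: {acc_face_to_sit:.2f}")
```

On the random stand-in data both scores hover near chance (0.50). With real data, above-chance accuracy in both train/test directions is the signature the abstract describes: a valence representation that abstracts away from perceptual features and generalizes across forms of evidence. Significance relative to chance would typically be assessed with permutation tests.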