Neural Architecture of Auditory Object Categorization

Bibliographic Details
Main Authors: Yune-Sang Lee, Michael Hanke, David Kraemer, Samuel Lloyd, Richard Granger
Format: Article
Language: English
Published: SAGE Publishing, 2011-10-01
Series: i-Perception
Online Access: https://doi.org/10.1068/ic756
Description
Summary: We can identify objects by sight or by sound, yet far less is known about auditory object recognition than about visual recognition. Any exemplar of a dog (e.g., a picture) can be recognized on multiple categorical levels (e.g., animal, dog, poodle). Using fMRI combined with machine-learning techniques, we studied these levels of categorization with sounds rather than images. Subjects heard sounds of various animate and inanimate objects, and unrecognizable control sounds. We report four primary findings: (1) some distinct brain regions selectively coded for basic (“dog”) versus superordinate (“animal”) categorization; (2) classification at the basic level entailed more extended cortical networks than did superordinate categorization; (3) multiple brain regions recognized human voices far better than any other sound category; (4) regions beyond temporal-lobe auditory areas were able to distinguish and categorize auditory objects. We conclude that multiple representations of an object exist at different categorical levels. This neural instantiation of object categories is distributed across multiple brain regions, including so-called “visual association areas,” indicating that these regions support object knowledge even when the input is auditory. Moreover, our findings appear to conflict with prior well-established theories of category-specific modules in the brain.
ISSN: 2041-6695
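
The summary's "fMRI combined with machine-learning techniques" refers to training classifiers on multivoxel activity patterns and testing whether they can decode which category of sound a subject heard. The sketch below is illustrative only and is not the authors' pipeline: it substitutes synthetic data for voxel patterns, uses a linear support-vector classifier, and assumes hypothetical trial counts, voxel counts, and category labels, simply to show what cross-validated decoding at the basic versus superordinate level looks like in scikit-learn.

    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Hypothetical dimensions: 120 sound trials, 500 voxels in one region of interest.
    n_trials, n_voxels = 120, 500
    X = rng.normal(size=(n_trials, n_voxels))   # stand-in voxel patterns, one row per trial

    # The same trials labeled at two categorical levels (labels are illustrative).
    basic = rng.integers(0, 4, size=n_trials)   # e.g., dog / bird / car / bell
    superordinate = (basic < 2).astype(int)     # e.g., animate (0, 1) vs. inanimate (2, 3)

    clf = LinearSVC(max_iter=10000)

    # Cross-validated decoding accuracy at each categorical level.
    for level, y in [("basic", basic), ("superordinate", superordinate)]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{level:13s} decoding accuracy: {acc:.2f}")

With real data, above-chance accuracy in a given region would indicate that its activity patterns carry information about that categorical level, which is the sense in which the study compares basic-level and superordinate-level coding across brain regions.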