Neural indices and looking behaviors of audiovisual speech processing in infancy and early childhood


Bibliographic Details
Main Author: Finch, Kayla
Other Authors: Tager-Flusberg, Helen
Language: English (en_US)
Published: 2019
Format: Thesis/Dissertation
ORCID: 0000-0001-6357-8537
Subjects: Developmental psychology; Audiovisual speech; Autism spectrum disorder; ERPs; Eye-tracking; Language; McGurk effect
Online Access: https://hdl.handle.net/2144/38792
Abstract:
Language is a multimodal process, with visual and auditory cues playing important roles in understanding speech. A well-controlled paradigm with audiovisually matched and mismatched syllables is often used to capture audiovisual (AV) speech processing. The ability to detect and integrate mismatching cues shows large individual variability across development and is linked to later language in typical development (TD) and to social abilities in autism spectrum disorder (ASD). However, no study has used a multimethod approach to better understand AV speech processing in early development. The studies' aims were to examine behavioral performance, gaze patterns, and neural indices of AV speech in: 1) TD preschoolers (N=60; females=35) and 2) infants at risk for developing ASD (high-risk, HR; N=37; females=10) and TD controls (low-risk, LR; N=42; females=21). In Study 1, I investigated preschoolers' gaze patterns and behavioral performance when presented with matched and mismatched AV speech and with visual-only (lipreading) speech. As hypothesized, lipreading abilities were associated with children's ability to integrate mismatching AV cues, and children looked towards the mouth when visual cues were helpful, specifically in the lipreading conditions. Unexpectedly, looking time towards the mouth was not associated with children's ability to integrate mismatching AV cues. Study 2 examined how the visual cues of AV speech modulated auditory event-related potentials (ERPs), and how these ERPs related to preschoolers' behavioral performance during an AV speech task. As hypothesized, auditory ERPs were attenuated during AV speech compared to auditory-only speech. Additionally, individual differences in children's neural processing of auditory and visual cues predicted which cue a child attended to in mismatched AV speech. In Study 3, I investigated ERPs of AV speech in LR and HR 12-month-olds and their association with language abilities at 18 months. Unexpectedly, I found no group differences: all infants were able to detect mismatched AV speech, as indexed by a more negative ERP response. As hypothesized, more mature neural processing of AV speech integration, measured as a more positive ERP response to fusible AV cues, predicted later language across all infants. These results highlight the importance of using multimethod approaches to understand variability in AV speech processing at two developmental stages.