Audiovisual Interaction: A Case of Time Perception

Bibliographic Details
Main Authors: Kuan-Ming Chen, 陳冠銘
Other Authors: Su-Ling Yeh
Format: Others
Language: en_US
Published: 2011
Online Access: http://ndltd.ncl.edu.tw/handle/91209084086608681134
Description
Summary: Ph.D. === National Taiwan University === Graduate Institute of Psychology === 99 === We focus on time perception to test two approaches to multisensory perception. The modality appropriateness hypothesis states that audition determines the temporal judgment of audiovisual stimuli, whereas the maximum likelihood estimation (MLE) model proposes optimal cue combination, in which visual and auditory signals are integrated with weights proportional to the reliability of each signal's estimate. In Experiment 1, observers compared the durations of two intervals based on the common modality. The standard stimulus was presented in the visual or auditory modality, and the comparison stimulus was presented in the visual, auditory, or both modalities. The results support the modality appropriateness hypothesis in that the sound expanded the perceived visual duration, whereas the disk did not affect the perceived auditory duration. In Experiment 2, the standard stimulus was presented at different durations; the effect of sound on perceived visual duration was observed only at the four intermediate durations, whereas the disk did not affect perceived auditory duration at any duration. In Experiment 3, when both the standard and comparison stimuli were presented bimodally, bimodal performance was predictable from unimodal performance according to the MLE model only at the longest duration. In Experiment 4, the reliability of the auditory signal was manipulated. The contribution of the auditory modality to the bimodal estimate was greater than that of the visual modality, and the prediction of the MLE model held only in the condition with the less reliable auditory signal. In Experiment 5, a looming disk was used to make the visual signal more temporally informative, and the results showed that the visual modality can determine the bimodal judgment. Across the five experiments, an auditory bias in bimodal time perception was observed. However, with a more reliable auditory signal or a more temporally informative visual signal, the prior auditory bias declined and the visual signal contributed more. In conclusion, neither the modality appropriateness hypothesis nor the MLE model can explain our results, and a hybrid model is proposed: the auditory bias acts as a prior assumption that observers hold in time perception when combining sensory signals, and the signals are integrated with weights determined by their reliability.
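For readers unfamiliar with the MLE model referred to above, its reliability-weighted scheme is conventionally written as follows. This is a sketch of the standard textbook formulation, not notation taken from the thesis itself; here $\hat{T}_A$ and $\hat{T}_V$ denote the unimodal auditory and visual duration estimates, and $\sigma_A^2$ and $\sigma_V^2$ their variances (the inverse of reliability).

\[
\hat{T}_{AV} = w_A \hat{T}_A + w_V \hat{T}_V, \qquad
w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_V^2}, \qquad
w_V = 1 - w_A,
\]
\[
\sigma_{AV}^2 = \frac{\sigma_A^2\,\sigma_V^2}{\sigma_A^2 + \sigma_V^2} \le \min\!\left(\sigma_A^2,\ \sigma_V^2\right).
\]

Under this scheme the less variable (more reliable) modality receives the larger weight, and the predicted bimodal variance is never larger than either unimodal variance; these are the quantitative predictions against which bimodal performance in Experiments 3 and 4 was compared.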