Bayesian integration of visual and auditory signals for spatial localization

PW Battaglia, RA Jacobs, RN Aslin - JOSA A, 2003 - opg.optica.org
Human observers localize events in the world by using sensory signals from multiple modalities. We evaluated two theories of spatial localization that predict how visual and auditory information are weighted when these signals specify different locations in space. According to one theory (visual capture), the signal that is typically most reliable dominates in a winner-take-all competition, whereas the other theory (maximum-likelihood estimation) proposes that perceptual judgments are based on a weighted average of the sensory signals in proportion to each signal’s relative reliability. Our results indicate that both theories are partially correct, in that relative signal reliability significantly altered judgments of spatial location, but these judgments were also characterized by an overall bias to rely on visual over auditory information. These results have important implications for the development of cue integration and for neural plasticity in the adult brain that enables humans to optimally integrate multimodal information.
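
The maximum-likelihood rule described in the abstract has a standard closed form: with independent Gaussian noise on each signal, the optimal combined estimate weights each cue by its relative reliability (inverse variance). Below is a minimal sketch of that rule; the function name and the example numbers are illustrative, not taken from the paper, and the paper's own finding is that observers show an additional overall bias toward vision beyond what this rule predicts.

```python
import numpy as np

def mle_combined_estimate(x_v, sigma_v, x_a, sigma_a):
    """Reliability-weighted average of a visual and an auditory
    location estimate under maximum-likelihood estimation with
    independent Gaussian noise on each signal.

    Reliability is the inverse variance of a signal; each cue is
    weighted in proportion to its relative reliability.
    """
    r_v = 1.0 / sigma_v**2   # visual reliability
    r_a = 1.0 / sigma_a**2   # auditory reliability
    w_v = r_v / (r_v + r_a)  # visual weight
    w_a = r_a / (r_v + r_a)  # auditory weight (= 1 - w_v)
    x_hat = w_v * x_v + w_a * x_a
    # The combined estimate is more reliable than either cue alone.
    sigma_hat = np.sqrt(1.0 / (r_v + r_a))
    return x_hat, sigma_hat

# Hypothetical example: a precise visual signal (sigma = 1 deg) and a
# noisier auditory signal (sigma = 3 deg) that disagree by 5 deg of azimuth.
x, s = mle_combined_estimate(x_v=0.0, sigma_v=1.0, x_a=5.0, sigma_a=3.0)
print(x, s)  # estimate is pulled only 0.5 deg toward the auditory location
```

Because the auditory signal here is nine times less reliable than the visual one, it receives a weight of only 0.1, so the combined estimate sits close to the visual location; under pure visual capture the auditory weight would instead be exactly zero.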