Audio-visual integration during overt visual attention

C Quigley, S Onat, S Harding, M Cooke… - Journal of Eye Movement Research, 2007 - bop.unibe.ch
Abstract
How do different sources of information arising from different modalities interact to control where we look? To answer this question under real-world-like operating conditions, we presented natural images and spatially localized sounds in Visual (V), Audio-visual (AV) and Auditory (A) conditions and measured subjects' eye movements. Our results demonstrate that eye movements in AV conditions are spatially biased towards the part of the image corresponding to the sound source. Interestingly, this spatial bias depends on the probability of a given image region being fixated (its saliency) in the V condition. This indicates that fixation behaviour in the AV conditions is the result of an integration process. Regression analysis shows that this integration is best accounted for by a linear combination of the unimodal saliencies.
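
The regression described in the abstract can be illustrated with a minimal sketch: audio-visual fixation density is modelled as a weighted sum of the two unimodal saliency maps and fitted by ordinary least squares. The map names, shapes, and the synthetic data below are assumptions for illustration only, not the authors' actual analysis pipeline.

# Minimal sketch (not the authors' code): fit the AV fixation map as a
# linear combination of unimodal (V and A) saliency maps.
import numpy as np

def fit_linear_combination(sal_v, sal_a, fix_av):
    """Fit fix_av ~ w_v * sal_v + w_a * sal_a + b by least squares.

    sal_v, sal_a : 2-D unimodal saliency maps (visual, auditory) -- assumed inputs
    fix_av       : 2-D fixation-probability map from the AV condition
    """
    # Design matrix: one column per unimodal map, plus an intercept column.
    X = np.column_stack([
        sal_v.ravel(),
        sal_a.ravel(),
        np.ones(sal_v.size),
    ])
    y = fix_av.ravel()
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    w_v, w_a, b = coeffs
    prediction = (X @ coeffs).reshape(fix_av.shape)
    return (w_v, w_a, b), prediction

# Purely synthetic example data, only to show the call pattern.
rng = np.random.default_rng(0)
sal_v = rng.random((32, 48))
sal_a = rng.random((32, 48))
fix_av = 0.7 * sal_v + 0.3 * sal_a + 0.05 * rng.random((32, 48))
(w_v, w_a, b), pred = fit_linear_combination(sal_v, sal_a, fix_av)
print(f"w_v={w_v:.2f}, w_a={w_a:.2f}, b={b:.2f}")

The fitted weights indicate how strongly each unimodal saliency map contributes to fixation behaviour in the AV condition; in the paper this linear model is reported to account for the integration better than the alternatives considered.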