S Gleiss, C Kayser - Neuropsychologia, 2014 - Elsevier
Multisensory interactions shape everyday perception, and stimuli in one modality can enhance perception in another even when they are not directly task-relevant. While the …
Motion is represented by low-level signals, such as size-expansion in vision or loudness changes in the auditory modality. The visual and auditory signals from the same object or …
AL Beer, B Röder - Cognitive Brain Research, 2004 - Elsevier
The present event-related potential (ERP) study investigated whether attending to a particular direction of motion similarly enhances the processing of auditory and visual …
Results from event-related potential (ERP) studies are reviewed that investigated crossmodal links in spatial attention between vision, audition and touch to find out which …
SL Simon-Dack, WA Teder-Sälejärvi - Brain research, 2008 - Elsevier
Multisensory integration and interaction occur when bimodal stimuli are presented either spatially congruent or incongruent, but temporally coincident. We investigated whether …
K Schmiedchen, C Freigang, I Nitsche, R Rübsamen - Brain research, 2012 - Elsevier
Motion perception can be altered by information received through multiple senses. So far, the interplay between the visual and the auditory modality in peripheral motion perception is …
M Eimer - Clinical neurophysiology, 1999 - Elsevier
Objectives: An event-related brain potential (ERP) study investigated whether spatially selective processing in vision and audition is controlled by a single supramodal system or by …
Recent behavioral and event-related brain potential (ERP) studies have revealed crossmodal interactions in endogenous spatial attention between vision and audition, plus vision …
K Hötting, F Rösler, B Röder - Experimental Brain Research, 2003 - Springer
An increasing number of animal and human studies suggest that different sensory systems share spatial representations in the brain. The aim of the present study was to test whether …