The multivariate temporal response function (mTRF) toolbox: a MATLAB toolbox for relating neural signals to continuous stimuli

MJ Crosse, GM Di Liberto, A Bednar… - Frontiers in human …, 2016 - frontiersin.org
Understanding how brains process sensory signals in natural environments is one of the key
goals of twenty-first century neuroscience. While brain imaging and invasive …

Auditory-inspired speech envelope extraction methods for improved EEG-based auditory attention detection in a cocktail party scenario

W Biesmans, N Das, T Francart… - IEEE transactions on …, 2016 - ieeexplore.ieee.org
This paper considers the auditory attention detection (AAD) paradigm, where the goal is to
determine which of two simultaneous speakers a person is attending to. The paradigm relies …

Congruent visual speech enhances cortical entrainment to continuous auditory speech in noise-free conditions

MJ Crosse, JS Butler, EC Lalor - Journal of Neuroscience, 2015 - Soc Neuroscience
Congruent audiovisual speech enhances our ability to comprehend a speaker, even in
noise-free conditions. When incongruent auditory and visual information is presented …

Lip-reading enables the brain to synthesize auditory features of unknown silent speech

M Bourguignon, M Baart, EC Kapnoula… - Journal of …, 2020 - Soc Neuroscience
Lip-reading is crucial for understanding speech in challenging conditions. But how the brain
extracts meaning from silent, visual speech is still under debate. Lip-reading in silence …

Neurophysiological indices of audiovisual speech processing reveal a hierarchy of multisensory integration effects

AE O'Sullivan, MJ Crosse, GM Di Liberto… - Journal of …, 2021 - Soc Neuroscience
Seeing a speaker's face benefits speech comprehension, especially in challenging listening
conditions. This perceptual benefit is thought to stem from the neural integration of visual …

A representation of abstract linguistic categories in the visual system underlies successful lipreading

AR Nidiffer, CZ Cao, A O'Sullivan, EC Lalor - NeuroImage, 2023 - Elsevier
There is considerable debate over how visual speech is processed in the absence of sound
and whether neural activity supporting lipreading occurs in visual brain areas. Much of the …

Visual cortical entrainment to motion and categorical speech features during silent lipreading

AE O'Sullivan, MJ Crosse, GM Di Liberto… - Frontiers in human …, 2017 - frontiersin.org
Speech is a multisensory percept, comprising an auditory and visual component. While the
content and processing pathways of audio speech have been well characterized, the visual …

Shared and modality-specific brain regions that mediate auditory and visual word comprehension

A Keitel, J Gross, C Kayser - Elife, 2020 - elifesciences.org
Visual speech carried by lip movements is an integral part of communication. Yet, it remains
unclear in how far visual and acoustic speech comprehension are mediated by the same …

Auditory detection is modulated by theta phase of silent lip movements

E Biau, D Wang, H Park, O Jensen… - Current Research in …, 2021 - Elsevier
Audiovisual speech perception relies, among other things, on our ability to map a
speaker's lip movements onto speech sounds. This multimodal matching is facilitated by …

Hearing through lip-reading: the brain synthesizes features of absent speech

M Bourguignon, M Baart, EC Kapnoula, N Molinaro - bioRxiv, 2018 - biorxiv.org
Lip-reading is crucial for understanding speech in challenging conditions. Neuroimaging
investigations have revealed that lip-reading activates auditory cortices in individuals …