Dynamic models for musical rhythm perception and coordination

EW Large, I Roman, JC Kim, J Cannon… - Frontiers in …, 2023 - frontiersin.org
Rhythmicity permeates large parts of human experience. Humans generate various motor
and brain rhythms spanning a range of frequencies. We also experience and synchronize to …

Neural tracking of continuous acoustics: properties, speech-specificity and open questions

B Zoefel, A Kösem - European Journal of Neuroscience, 2024 - Wiley Online Library
Human speech is a particularly relevant acoustic stimulus for our species, due to its role in
information transmission during communication. Speech is inherently a dynamic signal, and …

Modeling enculturated bias in entrainment to rhythmic patterns

T Kaplan, J Cannon, L Jamone… - PLOS Computational …, 2022 - journals.plos.org
Long-term and culture-specific experience of music shapes rhythm perception, leading to
enculturated expectations that make certain rhythms easier to track and more conducive to …

Infant low-frequency EEG cortical power, cortical tracking and phase-amplitude coupling predicts language a year later

A Attaheri, Á Ní Choisdealbha, S Rocha, P Brusini… - PloS one, 2024 - journals.plos.org
Cortical signals have been shown to track acoustic and linguistic properties of continuous
speech. This phenomenon has been measured in both children and adults, reflecting …

Dog–human vocal interactions match dogs' sensory-motor tuning

EC Déaux, T Piette, F Gaunet, T Legou, L Arnal… - Plos …, 2024 - journals.plos.org
Within species, vocal and auditory systems presumably coevolved to converge on a critical
temporal acoustic structure that can be best produced and perceived. While dogs cannot …

Rats synchronize predictively to metronomes

VG Rajendran, Y Tsdaka, TY Keung, JWH Schnupp… - iScience, 2024 - cell.com
Predictive auditory-motor synchronization, in which rhythmic movements anticipate rhythmic
sounds, is at the core of the human capacity for music. Rodents show impressive capabilities …

Decoding speech information from EEG data with 4-, 7- and 11-month-old infants: Using convolutional neural network, mutual information-based and …

M Keshavarzi, ÁN Choisdealbha, A Attaheri… - Journal of Neuroscience …, 2024 - Elsevier
Background Computational models that successfully decode neural activity into speech are
increasing in the adult literature, with convolutional neural networks (CNNs), backward …

Impact of temporal statistics on the processing of auditory stimuli [Impact des statistiques temporelles sur le traitement des stimuli auditifs]

P Bonnet - 2024 - theses.hal.science
Temporal regularities in the surrounding context are known to affect the perception of an
upcoming sensory event. For example, when we listen to a metronome, we …

Adaptive pacing in word segmentation and the Vowel-onset Paced Syllable Inference model

B Pittman-Polletta, L Dilley - PsyArXiv, November 2023 - files.osf.io
In speech perception, timing and content are interdependent. For example, in distal rate
effects, context speech rate determines the number of words, syllables, and phonemes …