Máté Aller
Speech perception
Unimodal speech perception predicts stable individual differences in audiovisual benefit for phonemes, words and sentences
There are substantial individual differences in the benefit that can be obtained from visual cues during speech perception. Here, 113 …
Jacqueline von Seth, Máté Aller, Matthew H. Davis
How does the brain combine prior expectations with heard speech during speech perception?
An M/EEG project investigating the temporal dynamics of sharpening and prediction-error computations in the brain during speech perception.
The neural bases of audio-visual benefit in speech perception
Speech perception in noisy environments is enhanced by seeing facial movements of communication partners. However, the neural mechanisms by which audio and visual speech are combined are not fully understood. This project aims to elucidate these neural mechanisms.