Massimo Lumaca, Niels Trusbak Haumann, Elvira Brattico, Manon Grube, Peter Vuust
The human brain's ability to extract and encode temporal regularities, and to predict the timing of upcoming events, is critical for music and speech perception. This work addresses how these mechanisms cope with different levels of temporal complexity, operationalized here as the number of distinct interval durations in rhythmic patterns. We use electroencephalography (EEG) to relate the mismatch negativity (MMN), a proxy for neural prediction error, to a measure of the information content of rhythmic sequences, their Shannon entropy. Within each of three conditions, participants listened to repeatedly presented standard rhythms of five tones (four inter-onset intervals) at a given level of entropy: zero (isochronous), medium (two distinct interval durations), or high (four distinct interval durations)...
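A minimal sketch of how the Shannon entropy of such rhythmic sequences could be computed from their inter-onset intervals. The interval values below are illustrative placeholders, not the study's actual stimuli; the entropy levels (0, 1, and 2 bits) follow from the counts of distinct durations described in the abstract.

```python
from collections import Counter
import math

def shannon_entropy(intervals):
    """Shannon entropy (in bits) of the distribution of interval durations.

    H = -sum(p_i * log2(p_i)), where p_i is the relative frequency of
    each distinct inter-onset interval in the sequence.
    """
    counts = Counter(intervals)
    n = len(intervals)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical inter-onset intervals in milliseconds (four per rhythm):
print(shannon_entropy([250, 250, 250, 250]))  # isochronous: 0.0 bits
print(shannon_entropy([250, 500, 250, 500]))  # two distinct durations: 1.0 bit
print(shannon_entropy([125, 250, 375, 500]))  # four distinct durations: 2.0 bits
```

With four intervals, entropy ranges from 0 bits (all identical) to log2(4) = 2 bits (all distinct), matching the zero-, medium-, and high-entropy conditions.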
June 2019: European Journal of Neuroscience