Article ID: 6023999
Journal: NeuroImage
Published Year: 2016
Pages: 12 Pages
File Type: PDF
Abstract
Current hypotheses about language processing advocate an integral relationship between the encoding of temporal information and linguistic processing in the brain. All such explanations must accommodate the evident ability of the perceptual system to process both slow and fast time scales in speech. However, most cortical neurons are limited in their capability to synchronise precisely to temporal modulations at rates faster than about 50 Hz. Hence, a central question in auditory neurophysiology concerns how the full range of perceptually relevant modulation rates might be encoded in the cerebral cortex. Here we show with concurrent noninvasive magnetoencephalography (MEG) and electroencephalography (EEG) measurements that the human auditory cortex transitions between a phase-locked (PL) mode of responding to modulation rates below about 50 Hz and a non-phase-locked (NPL) mode at higher rates. Precisely such dual response modes are predictable from the behaviours of single neurons in the auditory cortices of non-human primates. Our data point to a common mechanistic explanation for the single-neuron and MEG/EEG results and support the hypothesis that two distinct types of neuronal encoding mechanisms are employed by the auditory cortex to represent a wide range of temporal modulation rates. This dual encoding model allows slow and fast modulations in speech to be processed in parallel and is therefore consistent with theoretical frameworks in which slow temporal modulations (such as rhythm or syllabic structure) are akin to the contours or edges of visual objects, whereas faster modulations (such as periodicity pitch or phonemic structure) are more like visual texture.
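As an illustration of the phase-locked/non-phase-locked distinction described in the abstract, the following is a minimal sketch of inter-trial phase coherence (often called the phase-locking value), a standard way that phase-locked MEG/EEG responses are quantified. This is not the authors' specific analysis pipeline; the simulated data, noise levels, and function names here are assumptions made for illustration only.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(trials):
    """Inter-trial phase coherence per time point.

    trials: array of shape (n_trials, n_samples) holding narrowband signals.
    Returns |mean over trials of the unit-length phase vectors|, in [0, 1]:
    near 1 when the phase is consistent across trials (phase-locked),
    near 0 when the phase is random across trials (non-phase-locked).
    """
    phases = np.angle(hilbert(trials, axis=1))          # instantaneous phase
    return np.abs(np.mean(np.exp(1j * phases), axis=0))  # vector average

# Hypothetical simulation: 50 trials of a 40 Hz response, 1 s at 1000 Hz.
fs = 1000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(0)

# Phase-locked: same stimulus-driven phase on every trial, plus noise.
locked = np.array([np.cos(2 * np.pi * 40 * t) + 0.3 * rng.standard_normal(t.size)
                   for _ in range(50)])

# Non-phase-locked: same 40 Hz power, but a random phase on each trial.
unlocked = np.array([np.cos(2 * np.pi * 40 * t + rng.uniform(0, 2 * np.pi))
                     + 0.3 * rng.standard_normal(t.size)
                     for _ in range(50)])

print(phase_locking_value(locked).mean())    # high: response is phase-locked
print(phase_locking_value(unlocked).mean())  # low: power without phase-locking
```

The key point mirrored from the abstract: a non-phase-locked response can carry the same oscillatory power as a phase-locked one, but it averages away in phase-sensitive measures, so the two response modes must be assessed with different metrics.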
Related Topics
Life Sciences Neuroscience Cognitive Neuroscience