Article ID: 6262802
Journal: Brain Research
Published Year: 2015
Pages: 9
File Type: PDF
Abstract

• Predictive coding of visual-auditory and motor-auditory events was compared.
• Sensory prediction was tested by auditory omissions.
• Omission responses of visual-auditory and motor-auditory events were alike.
• Motor- and visual-auditory prediction share a common neural circuitry.

The amplitude of auditory components of the event-related potential (ERP) is attenuated when sounds are self-generated compared to externally generated sounds. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. Auditory potentials are also attenuated when a sound is accompanied by a video of anticipatory visual motion that reliably predicts the sound. Here, we investigated whether the neural underpinnings of prediction of upcoming auditory stimuli are similar for motor-auditory (MA) and visual-auditory (VA) events using a stimulus omission paradigm. In the MA condition, a finger tap triggered the sound of a handclap, whereas in the VA condition the same sound was accompanied by a video showing the handclap. In both conditions, the auditory stimulus was omitted in either 50% or 12% of the trials. These auditory omissions induced early and mid-latency ERP components (oN1 and oN2, presumably reflecting prediction and prediction error) and subsequent higher-order error evaluation processes. The oN1 and oN2 of MA and VA were alike in amplitude, topography, and neural sources, even though the predictions originate in different brain areas (motor versus visual cortex). This suggests that MA and VA predictions activate a sensory template of the sound in auditory cortex. This article is part of a Special Issue entitled SI: Prediction and Attention.

Related Topics
Life Sciences; Neuroscience; Neuroscience (General)