Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
11004673 | Neuropsychologia | 2018 | 10 Pages |
Abstract
Humans' and non-human animals' ability to process time on the scale of milliseconds and seconds is essential for adaptive behaviour. A central question in how brains keep track of time is how specific temporal information is to each sensory modality. In the present study, we show that encoding of temporal intervals in the auditory and visual modalities is qualitatively similar. Human participants were instructed to reproduce intervals in the range from 750 ms to 1500 ms marked by auditory or visual stimuli. Our behavioural results suggest that, although participants were more accurate in reproducing intervals marked by auditory stimuli, there was a strong correlation in performance between modalities. Using multivariate pattern analysis in scalp EEG, we show that activity during late periods of the intervals was similar within and between modalities. Critically, we show that a multivariate pattern classifier was able to accurately predict the elapsed interval, even when trained on an interval marked by a stimulus of a different sensory modality. Taken together, our results suggest that, while there are differences in the processing of intervals marked by auditory and visual stimuli, they also share a common neural representation.
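The cross-modal decoding logic described in the abstract (train a classifier on trials from one modality, test it on the other) can be illustrated with a minimal sketch. Everything below is synthetic and hypothetical: the simulated channel patterns, trial counts, and nearest-class-mean classifier are illustrative assumptions, not the authors' actual EEG pipeline.

```python
# Hypothetical sketch of cross-modal interval decoding.
# Assumption: each interval evokes a shared spatial pattern across
# channels, plus a modality-specific offset and Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
intervals = [750, 1125, 1500]           # ms, within the studied range
n_trials, n_channels = 60, 30

def simulate(modality_shift):
    """Simulate trials for one modality: interval-specific pattern
    + modality-specific offset + noise."""
    X, y = [], []
    for i, interval in enumerate(intervals):
        pattern = np.zeros(n_channels)
        pattern[i * 10:(i + 1) * 10] = 1.0   # interval-specific channels
        trials = pattern + modality_shift + rng.normal(0, 0.5, (n_trials, n_channels))
        X.append(trials)
        y += [interval] * n_trials
    return np.vstack(X), np.array(y)

X_aud, y_aud = simulate(modality_shift=0.2)   # "auditory" trials
X_vis, y_vis = simulate(modality_shift=-0.2)  # "visual" trials

# Train on auditory: one mean pattern (prototype) per interval.
means = {iv: X_aud[y_aud == iv].mean(axis=0) for iv in intervals}

# Test on visual: classify each trial by its nearest prototype.
preds = np.array([min(means, key=lambda iv: np.linalg.norm(x - means[iv]))
                  for x in X_vis])
cross_modal_acc = (preds == y_vis).mean()
print(f"cross-modal accuracy: {cross_modal_acc:.2f}")  # well above 1/3 chance
```

If the interval representation were purely modality-specific, this train-on-auditory/test-on-visual accuracy would sit at chance (1/3 here); above-chance transfer is the signature of a shared representation.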
Related Topics
Life Sciences
Neuroscience
Behavioral Neuroscience
Authors
Louise C. Barne, João R. Sato, Raphael Y. de Camargo, Peter M.E. Claessens, Marcelo S. Caetano, André M. Cravo