Article ID: 4356265
Journal: Hearing Research
Published Year: 2007
Pages: 12
File Type: PDF
Abstract
Mutual information (MI) is increasingly used to quantify neural responses. However, many researchers still regard it with some skepticism, because it is not always clear what MI actually measures, and because MI is difficult to calculate in practice. This paper aims to clarify these issues. First, it provides an interpretation of mutual information as a variability decomposition, analogous to the standard variance decomposition routinely used in statistical analyses of neural data, except that the measure of variability is entropy rather than variance. Second, it discusses the aspects of MI that make its calculation difficult. The goal of this paper is to clarify when and how information theory can be used informatively and reliably in auditory neuroscience.
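The decomposition view described in the abstract can be illustrated with a minimal sketch (not the paper's own method): MI written as H(R) - H(R|S), i.e. the total response entropy minus the stimulus-weighted within-stimulus entropy, which parallels a between-group term in a variance decomposition. The plug-in estimator and the example count table below are hypothetical and chosen only for illustration; such naive estimates are known to be biased for small samples, which is one reason MI is hard to compute in practice.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(counts):
    """Plug-in MI estimate from a stimulus-by-response count table.

    MI = H(R) - H(R|S): total response entropy minus the mean
    response entropy within each stimulus (the entropy analogue
    of explained variability in a variance decomposition).
    """
    counts = np.asarray(counts, dtype=float)
    joint = counts / counts.sum()              # joint p(s, r)
    p_s = joint.sum(axis=1)                    # stimulus marginal p(s)
    p_r = joint.sum(axis=0)                    # response marginal p(r)
    h_r = entropy(p_r)                         # total response variability H(R)
    # Conditional entropy H(R|S): stimulus-weighted within-stimulus entropy.
    h_r_given_s = sum(p_s[i] * entropy(joint[i] / p_s[i])
                      for i in range(len(p_s)) if p_s[i] > 0)
    return h_r - h_r_given_s

# Hypothetical spike-count responses, binned per stimulus.
counts = np.array([[30, 10,  2],   # stimulus A: counts per response bin
                   [ 5, 25, 12],   # stimulus B
                   [ 1,  8, 33]])  # stimulus C
print(f"MI estimate: {mutual_information(counts):.3f} bits")
```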
Related Topics
Life Sciences; Neuroscience; Sensory Systems