Article ID: 6260436
Journal: Current Opinion in Behavioral Sciences
Published Year: 2016
Pages: 6
File Type: PDF
Highlights

• A non-associative model of associative learning.
• Focus on the encoding of intervals into memory.
• Partly inspired by results showing interval memory in Purkinje cells.
• Founded on the minimum description length principle for stochastic model selection.
• Explains cue competition, response timing, and the parametric invariances.

Abstract

Two principles of information theory, maximum entropy and minimum description length, motivate a computational model of associative learning that explains the assignment of credit, response timing, and the parametric invariances. The maximum entropy principle gives rise to two distributions, the exponential and the Gaussian, which naturally lend themselves to inference involving each of the two fundamental classes of predictors: enduring states and point events. These distributions are the 'atoms' from which more complex representations are built. The representation that is 'learned' is determined by the principle of minimum description length. In this theory, learning is a synonym for data compression: the representation of its experience that the animal learns is the representation that best allows the data of experience to be compressed.
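For concreteness, the two maximum-entropy distributions named in the abstract have standard closed forms (a textbook fact, not a detail taken from the paper). For a nonnegative variable with a fixed mean $\mu$, the maximum-entropy distribution is the exponential,

$$p(t) = \frac{1}{\mu}\, e^{-t/\mu}, \qquad t \ge 0,$$

and for a variable with fixed mean $\mu$ and variance $\sigma^2$ it is the Gaussian,

$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right).$$

Under a two-part minimum description length code, the representation $M$ selected for data $D$ is the one minimizing

$$L(M) + L(D \mid M) = L(M) - \log_2 P(D \mid M).$$

The sketch below illustrates this model-selection idea, under stated assumptions: it approximates the parameter cost $L(M)$ with a BIC-style $(k/2)\ln n$ term and compares how well an exponential versus a Gaussian model compresses a sample of intervals. It is an illustration of the general MDL principle, not the authors' implementation; the function names (`description_length`, `compare_models`) are hypothetical.

```python
import numpy as np
from scipy.stats import expon, norm

# Illustrative sketch (not the paper's code): two-part MDL comparison of
# an exponential vs. a Gaussian model of a sample of intervals.
# Description length in nats = parameter cost + data cost, with the
# parameter cost approximated by a BIC-style (k/2) ln n term.

def description_length(data, log_likelihood, n_params):
    """Approximate two-part code length: L(M) + L(D | M)."""
    return n_params / 2 * np.log(len(data)) - log_likelihood

def compare_models(intervals):
    intervals = np.asarray(intervals)

    # Exponential model: maximum entropy for a nonnegative variable
    # with a fixed mean; one parameter (the mean interval).
    mu = intervals.mean()
    ll_exp = expon.logpdf(intervals, scale=mu).sum()
    dl_exp = description_length(intervals, ll_exp, n_params=1)

    # Gaussian model: maximum entropy for a fixed mean and variance;
    # two parameters.
    m, s = intervals.mean(), intervals.std(ddof=1)
    ll_gauss = norm.logpdf(intervals, loc=m, scale=s).sum()
    dl_gauss = description_length(intervals, ll_gauss, n_params=2)

    return {"exponential": dl_exp, "gaussian": dl_gauss}

rng = np.random.default_rng(0)
# Memoryless (random-rate) intervals compress best under the exponential
# model; tightly timed intervals compress best under the Gaussian model.
print(compare_models(rng.exponential(10.0, size=200)))
print(compare_models(rng.normal(10.0, 1.0, size=200)))
```

On data generated by a memoryless process the exponential model yields the shorter description, while on tightly timed intervals the Gaussian does, which is the sense in which learning a representation amounts to choosing the model that best compresses the data of experience.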
