Article ID: 408534
Journal: Neurocomputing
Published Year: 2008
Pages: 4
File Type: PDF
Abstract

Donald Hebb postulated that neurons that fire together wire together. However, Hebbian learning is inherently unstable because synaptic weights self-amplify: the more a synapse drives a postsynaptic cell, the more the synaptic weight grows. We present a new, biologically realistic way of stabilising synaptic weights by introducing a third factor that switches learning on or off, so that self-amplification is minimised. The third factor can be identified with the activity of dopaminergic neurons in the ventral tegmental area, which leads to a new interpretation of the dopamine signal that goes beyond the classical prediction-error hypothesis.
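The instability and the gating idea can be illustrated with a minimal sketch. This is not the paper's model: the learning rate, the constant presynaptic input, and the periodic gating schedule below are all illustrative assumptions; it only shows how a plain Hebbian update self-amplifies while a third-factor gate limits the growth.

```python
ETA = 0.1    # learning rate (illustrative assumption)
STEPS = 100  # number of update steps (illustrative assumption)

def run(third_factor):
    """Simulate one synapse whose weight drives the postsynaptic
    activity, so an ungated Hebbian update self-amplifies."""
    w = 0.5
    for t in range(STEPS):
        pre = 1.0                 # presynaptic activity (constant input, assumed)
        post = w * pre            # postsynaptic activity driven by this synapse
        gate = third_factor(t)    # third factor: 1 = learning on, 0 = off
        w += ETA * gate * pre * post  # three-factor Hebbian update
    return w

# Pure Hebbian learning: the third factor is always on,
# so the weight grows multiplicatively without bound.
w_hebb = run(lambda t: 1.0)

# Gated learning: the third factor switches learning on only
# occasionally (a stand-in for a phasic dopamine-like signal),
# so self-amplification is strongly limited.
w_gated = run(lambda t: 1.0 if t % 10 == 0 else 0.0)

print(w_hebb, w_gated)
```

With learning always on, the weight compounds by a factor of 1.1 per step and explodes; gating the same rule to one step in ten keeps it bounded over the same horizon.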
