Article ID: 408736
Journal: Neurocomputing
Published Year: 2006
Pages: 4
File Type: PDF
Abstract

Hebbian learning has been a staple of neural-network models for many years. It is well known that the most straightforward implementations of this popular learning rule lead to unconstrained weight growth. A newly discovered property of cortical neurons is that they try to maintain a preset average firing rate [G.G. Turrigiano, S.B. Nelson, Homeostatic plasticity in the developing nervous system, Nat. Rev. Neurosci. 5 (2004) 97–107]. We use this property to control the Hebbian learning process in a self-organizing map network. In this article, we extend the practicality of this type of learning rule by deriving a scaling equation for the learning rates of various network architectures.
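The abstract pairs two mechanisms: a Hebbian term that strengthens co-active synapses (and, alone, grows the weights without bound) and a homeostatic term that holds the neuron's average firing rate at a preset target. The sketch below is a minimal illustration of that pairing for a single rate neuron, not the paper's method: the rectified-linear neuron, the multiplicative scaling rule, and all parameter values (eta, tau, target_rate) are assumptions for demonstration only.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not taken from the paper).
eta = 0.01          # Hebbian learning rate
tau = 0.01          # homeostatic time constant
target_rate = 0.5   # preset average firing rate the neuron maintains
n_inputs = 20

w = rng.uniform(0.0, 0.1, n_inputs)  # synaptic weights
avg_rate = 0.0                       # running estimate of the firing rate

for step in range(20000):
    x = rng.uniform(0.0, 1.0, n_inputs)  # presynaptic activity
    y = max(0.0, float(w @ x))           # rectified linear rate neuron

    # Plain Hebbian update: on its own, this grows the weights without bound.
    w += eta * y * x

    # Homeostatic scaling: multiplicatively rescale all weights so the
    # running average firing rate is pulled toward the preset target.
    avg_rate += tau * (y - avg_rate)
    w *= 1.0 + tau * (target_rate - avg_rate) / target_rate
    w = np.clip(w, 0.0, None)            # keep weights non-negative

print(f"final average firing rate: {avg_rate:.3f} (target {target_rate})")

Without the scaling step, the Hebbian term alone drives the weights upward indefinitely; the multiplicative rescaling instead ties weight magnitude to the firing-rate error, which is the constraint the paper exploits.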

Related Topics
Physical Sciences and Engineering / Computer Science / Artificial Intelligence