Article ID: 410470
Journal: Neurocomputing
Published Year: 2009
Pages: 7
File Type: PDF
Abstract

A model of an attractor neural network on a small-world topology (local and random connectivity) is investigated. The synaptic weights are random, driving the neural activity towards a disordered state. An ordered macroscopic neural state is induced by a bias in the weight connections, and the evolution of the network when initialized in blocks of positive/negative activity is studied. The retrieval of this block-like structure is investigated, and an application to the Hebbian learning of a pattern carrying local information is presented. The block attractor and the global attractor compete according to the initial conditions, and the change of stability from one to the other depends on the long-range character of the network connectivity, as shown by a flow-diagram analysis. Moreover, a larger number of blocks emerges as the network is diluted.
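
As a rough, non-authoritative sketch of the setup the abstract describes (not the paper's actual model or parameters), the snippet below builds a small-world connectivity of local ring links plus sparse random shortcuts, stores one block-structured ±1 pattern with a Hebbian outer product restricted to that connectivity, and iterates sign-update dynamics from a block-like initial state to measure retrieval. The sizes N, K_LOCAL, P_RANDOM, and N_BLOCKS are assumed illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters only; the paper's actual values are not given here.
N = 1000         # number of neurons
K_LOCAL = 5      # local neighbours on each side of the ring
P_RANDOM = 0.05  # probability of an extra long-range (random) connection
N_BLOCKS = 4     # number of +/- activity blocks in the stored pattern

# Small-world connectivity: a ring of local links plus sparse random shortcuts.
C = np.zeros((N, N), dtype=bool)
for i in range(N):
    for d in range(1, K_LOCAL + 1):
        C[i, (i + d) % N] = True
        C[i, (i - d) % N] = True
C |= rng.random((N, N)) < P_RANDOM
np.fill_diagonal(C, False)

# One block-structured +/-1 pattern stored with a Hebbian rule,
# restricted to the connectivity mask.
pattern = np.repeat([1, -1] * (N_BLOCKS // 2), N // N_BLOCKS)
J = np.where(C, np.outer(pattern, pattern) / N, 0.0)

# Zero-temperature dynamics: start mostly aligned with the block pattern and
# iterate sign updates; the overlap measures retrieval of the block structure.
s = np.where(rng.random(N) < 0.8, pattern, -pattern)
for _ in range(50):
    s = np.sign(J @ s)
    s[s == 0] = 1

print(f"overlap with the block pattern: {pattern @ s / N:.3f}")
```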
