Article ID: 404384
Journal: Neural Networks
Published Year: 2011
Pages: 9
File Type: PDF
Abstract

Neurodynamical models of working memory (WM) should provide mechanisms for storing, maintaining, retrieving, and deleting information. Many models address only a subset of these aspects. Here we present a rather simple WM model in which all of these performance modes are trained into a recurrent neural network (RNN) of the echo state network (ESN) type. The model is demonstrated on a bracket-level parsing task with a stream of rich and noisy graphical script input. In terms of nonlinear dynamics, memory states correspond, intuitively, to attractors in an input-driven system. As a supplementary contribution, the article proposes a rigorous formal framework to describe such attractors, generalizing from the standard definition of attractors in autonomous (input-free) dynamical systems.
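
To make the echo state network setting concrete, here is a minimal sketch of an ESN with a ridge-regression readout trained to track bracket nesting depth on a synthetic symbolic stream. This is an illustrative approximation only, not the authors' model: the paper uses rich, noisy graphical script input and trains dedicated WM units with feedback, whereas this sketch uses clean one-hot symbols, and all names and parameter values (reservoir size, spectral radius, regularization) are assumptions.

```python
import numpy as np

# Illustrative ESN sketch for a bracket-depth tracking task (not the paper's model).
rng = np.random.default_rng(0)

# --- synthetic bracket stream: '(' , ')' , and a filler symbol 'a' ---
def make_stream(length, max_depth=3):
    symbols, depths, depth = [], [], 0
    for _ in range(length):
        choices = ['a']
        if depth < max_depth:
            choices.append('(')
        if depth > 0:
            choices.append(')')
        s = rng.choice(choices)
        depth += (s == '(') - (s == ')')
        symbols.append(s)
        depths.append(depth)
    return symbols, np.array(depths, dtype=float)

def one_hot(symbols):
    idx = {'(': 0, ')': 1, 'a': 2}
    u = np.zeros((len(symbols), 3))
    for t, s in enumerate(symbols):
        u[t, idx[s]] = 1.0
    return u

# --- random reservoir, scaled to a spectral radius of ~0.9 (assumed value) ---
N = 200
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(N, 3))

def run_reservoir(u):
    x = np.zeros(N)
    states = []
    for t in range(len(u)):
        x = np.tanh(W @ x + W_in @ u[t])   # standard ESN state update
        states.append(x.copy())
    return np.array(states)

# --- ridge-regression readout: reservoir states -> current bracket depth ---
symbols, depth = make_stream(5000)
X = run_reservoir(one_hot(symbols))
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(N), X.T @ depth)

# --- evaluate on a fresh stream ---
symbols_te, depth_te = make_stream(1000)
pred = run_reservoir(one_hot(symbols_te)) @ W_out
print("test RMSE on bracket depth:", np.sqrt(np.mean((pred - depth_te) ** 2)))
```

Here the bracket depth must be carried across arbitrarily long filler stretches, which is the WM aspect the paper addresses by training explicit memory states (attractor-like states of the input-driven system) rather than relying on the reservoir's fading memory alone.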

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors