Article ID | Journal | Published Year | Pages
---|---|---|---
10326069 | Neural Networks | 2005 | 13
Abstract
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network may be retrieved explicitly, in the reverse order of presentation, thus providing a straightforward neural implementation of a logical stack.
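The building block the abstract refers to can be illustrated in isolation. The following is a minimal sketch (not the paper's recurrent architecture) of Oja's constrained Hebbian rule extracting the first principal component of a data stream; the data, learning rate, and variable names are illustrative assumptions.

```python
import numpy as np

# Sketch (assumption): single-unit Oja's rule on synthetic 2-D data.
# The Hebbian term y*x grows the weight along the direction of maximal
# variance; the -y^2*w decay term keeps the weight vector normalized,
# so w converges to the first principal component.

rng = np.random.default_rng(0)

# Synthetic data whose dominant variance lies along the (1, 1) direction.
base = rng.normal(size=(5000, 2)) * np.array([3.0, 0.5])
theta = np.pi / 4
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
data = base @ rot.T

w = rng.normal(size=2)
w /= np.linalg.norm(w)
lr = 0.01
for x in data:
    y = w @ x                   # neuron output (Hebbian activity)
    w += lr * y * (x - y * w)   # Oja's rule: Hebbian growth minus decay

w /= np.linalg.norm(w)
```

In the paper's setting, the same rule is applied to a recurrent linear network whose input at each step includes the network's previous output, which is what generalizes PCA to the temporal (Recursive PCA) case.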
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Thomas Voegtlin