Article ID | Journal | Published Year | Pages
---|---|---|---
4977722 | Signal Processing | 2017 | 5
Abstract
In a complex-valued phasor Hopfield neural network with a training pattern, only the rotated patterns are fixed points, whereas a complex-valued K-state Hopfield neural network has K fixed points. In a quaternionic Hopfield neural network (QHNN) with a continuous activation function, again only the rotated patterns are fixed points. We consider a QHNN with a split activation function, which is a 16-state activation function; this type of QHNN is referred to as a split QHNN (SQHNN). An SQHNN is expected to have 16 fixed points, all of which are global minima, so the rate at which the training pattern is recalled from random initial states should be 1/16. In our computer simulations, however, the recall rate was higher. We investigate the reasons for this discrepancy.
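To illustrate why a split activation yields 16 states, here is a minimal sketch (not the paper's implementation): each of the four components of a quaternion is passed through the sign function independently, so the output lies in a set of 2^4 = 16 quaternions with components in {-1, +1}. The function name `split_activation` and the convention of mapping zero components to +1 are assumptions made for this example.

```python
import numpy as np

def split_activation(q):
    """Split (16-state) activation for a quaternion q = (q0, q1, q2, q3).

    Each of the four components is passed through the sign function
    independently, so the result is one of the 2**4 = 16 quaternions
    whose components are all -1 or +1.
    """
    q = np.asarray(q, dtype=float)
    s = np.sign(q)
    # np.sign(0) is 0; map zero components to +1 (an assumed convention)
    # so the output stays inside the 16-state set.
    s[s == 0] = 1.0
    return s

# Example: an arbitrary quaternion state is driven to one of the 16 corner states.
print(split_activation([0.3, -1.2, 0.0, 2.5]))  # -> [ 1. -1.  1.  1.]
```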
Related Topics
Physical Sciences and Engineering
Computer Science
Signal Processing
Authors
Masaki Kobayashi