Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6863158 | Neural Networks | 2018 | 10 Pages |
Abstract
This paper proposes a new learning rule, margined Winner-Take-All (mWTA), for training the deepest layer. Each time a training pattern is presented during learning, if the result of recognition by WTA (Winner-Take-All) is an error, a new cell is generated in the deepest layer. Here we add a certain amount of margin to the WTA. In other words, only during learning, a certain amount of handicap is given to cells of classes other than that of the training vector, and the winner is chosen under this handicap. By introducing the margin into the WTA, we can generate a compact set of cells with which a high recognition rate can be obtained at a small computational cost. The ability of this mWTA is demonstrated by computer simulation.
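The mechanism described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name `train_mwta`, the normalized inner-product similarity, and the default margin value are assumptions made for the example; the paper's deepest-layer cells and exact comparison rule may differ.

```python
import numpy as np

def train_mwta(patterns, labels, margin=0.1):
    """Illustrative sketch of margined Winner-Take-All learning.

    Cells are (reference vector, class label) pairs. During learning,
    cells of classes other than the training vector's class receive a
    handicap of `margin`; if the handicapped winner's class is wrong
    (or no cell exists yet), a new cell is generated from the
    training vector.
    """
    refs, classes = [], []                       # generated cells
    for x, y in zip(patterns, labels):
        x = np.asarray(x, dtype=float)
        x = x / np.linalg.norm(x)                # compare by normalized inner product
        if refs:
            sims = np.array([r @ x for r in refs])
            sims[np.array(classes) != y] -= margin   # handicap other classes
            winner = int(np.argmax(sims))
        if not refs or classes[winner] != y:
            refs.append(x)                       # recognition error: generate a new cell
            classes.append(y)
    return refs, classes
```

At recognition time, plain WTA (argmax of similarity with no margin) is used over the generated cells; the margin exists only to make the learning phase stricter, so fewer but better-placed cells are generated.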
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Kunihiko Fukushima