Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
406925 | Neurocomputing | 2013 | 4 | |
The training algorithm studied in this paper is inspired by the biological metaplasticity property of neurons. During the training phase, the Artificial Metaplasticity Learning Algorithm can be regarded as a new probabilistic version of the presynaptic rule: it assigns larger weight updates to the less probable activations than to the more probable ones. The algorithm is proposed for Artificial Neural Networks in general, although so far it has only been implemented and tested on Multilayer Perceptrons. Experiments on several multidisciplinary applications show much more efficient training, also raising Multilayer Perceptron results to the performance of the best state-of-the-art systems, which are usually much more complex.
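To make the idea concrete, the following is a minimal sketch of how such a rule could be wired into ordinary backpropagation: each per-sample update is scaled by the inverse of an estimated probability of the input pattern, so rare activations produce larger weight changes. The Gaussian-style estimate, the hyperparameter B, and the function and variable names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def estimated_input_probability(x, B=0.2):
    """Assumed Gaussian-style estimate of how probable the input pattern is.

    Inputs far from the origin (after standardization) are treated as rare.
    B is an illustrative smoothing hyperparameter, not a value from the paper.
    """
    return np.exp(-B * np.sum(x ** 2))

def train_metaplastic_mlp(X, y, hidden=8, eta=0.05, epochs=200):
    """Backpropagation for a one-hidden-layer MLP where each per-sample update
    is scaled by 1 / p_hat(x): rarer activations receive larger weight changes."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            h = sigmoid(x @ W1 + b1)          # hidden activations
            o = sigmoid(h @ W2 + b2)          # output activation
            # Metaplasticity-style scaling: boost learning on improbable patterns.
            scale = eta / max(estimated_input_probability(x), 1e-3)
            delta_o = (o - t) * o * (1 - o)       # output error term
            delta_h = delta_o * W2 * h * (1 - h)  # hidden error terms
            W2 -= scale * delta_o * h
            b2 -= scale * delta_o
            W1 -= scale * np.outer(x, delta_h)
            b1 -= scale * delta_h
    return W1, b1, W2, b2

# Tiny usage example on synthetic, standardized data.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
params = train_metaplastic_mlp(X, y)
```

In this sketch the metaplasticity idea lives entirely in the `scale` factor; with `estimated_input_probability` returning a constant, the loop reduces to standard stochastic backpropagation.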