Article ID: 409846 | Journal: Neurocomputing | Published Year: 2012 | Pages: 7 | File Type: PDF
Abstract

The backpropagation (BP) algorithm is widely recognized as a powerful tool for training feedforward neural networks (FNNs). However, because the algorithm uses the steepest descent technique to adjust the network weights, it suffers from two major drawbacks: a slow convergence rate and a tendency to produce suboptimal solutions. This paper proposes a modified BP algorithm that can remarkably alleviate the local-minima problem faced by the standard BP (SBP). As a by-product of the modified training procedure, a bucket of candidate weight-matrix solutions found during training is collected, from which the best solution is chosen competitively based on performance on a validation dataset. Simulations are conducted on four benchmark classification tasks to compare and evaluate the classification performance and generalization capability of the proposed modified BP and SBP.
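The bucket-and-select idea described above can be sketched as follows. This is a minimal illustrative example, not the paper's actual algorithm: it trains a tiny one-hidden-layer network by plain steepest descent on a toy dataset, snapshots the weight matrices after every epoch into a "bucket", and then competitively picks the snapshot with the best accuracy on a held-out validation set. All names, network sizes, and the dataset are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D binary classification data (illustrative, not one of the
# paper's four benchmark tasks).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)
X_train, y_train = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights for a 2-3-1 feedforward network (hypothetical sizes).
W1 = rng.normal(scale=0.5, size=(2, 3))
W2 = rng.normal(scale=0.5, size=(3, 1))

lr = 0.5
bucket = []  # candidate weight-matrix solutions collected during training

for epoch in range(200):
    # Forward pass.
    H = sigmoid(X_train @ W1)
    out = sigmoid(H @ W2)
    # Backward pass: steepest descent on squared error.
    d_out = (out - y_train) * out * (1 - out)
    d_H = (d_out @ W2.T) * H * (1 - H)
    W2 -= lr * (H.T @ d_out) / len(X_train)
    W1 -= lr * (X_train.T @ d_H) / len(X_train)
    # Snapshot the current weights into the bucket.
    bucket.append((W1.copy(), W2.copy()))

def val_accuracy(w1, w2):
    """Classification accuracy of a weight pair on the validation set."""
    pred = sigmoid(sigmoid(X_val @ w1) @ w2) > 0.5
    return float(np.mean(pred == (y_val > 0.5)))

# Competitive selection: keep the bucket entry that performs best on
# the validation dataset.
best_W1, best_W2 = max(bucket, key=lambda w: val_accuracy(*w))
```

The key point is that the final model is not necessarily the last-epoch weights: any snapshot along the trajectory can win the competition, which is what lets the selection step sidestep a poor final resting point.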
