Article ID: 405512
Journal: Neural Networks
Published Year: 2012
Pages: 9
File Type: PDF
Abstract

The weight decay method, a classical complexity regularization technique, is simple and works well in some applications of backpropagation neural networks (BPNN). This paper establishes weak and strong convergence results for cyclic and almost cyclic learning of BPNN with a penalty term (CBP-P and ACBP-P). Convergence is guaranteed under relaxed conditions on the activation functions and the learning rate, together with an assumption on the stationary set of the error function. Furthermore, the boundedness of the weights during training is obtained in a simple and direct way. Numerical simulations support the theoretical results and demonstrate that ACBP-P outperforms CBP-P in both convergence speed and generalization ability.
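To make the training rule concrete, below is a minimal sketch of per-sample backpropagation with a weight-decay penalty term added to each gradient update, which is the scheme the abstract describes. The cyclic variant (CBP-P) visits the samples in a fixed order every epoch, while the almost cyclic variant (ACBP-P) reshuffles the order each epoch. The network size, toy data, and hyperparameters (eta, lam) are illustrative assumptions, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy XOR data, purely illustrative.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A small 2-4-1 network with sigmoid activations (assumed architecture).
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))

eta = 0.5    # learning rate (assumed value)
lam = 1e-4   # penalty coefficient for weight decay (assumed value)

for epoch in range(5000):
    order = np.arange(len(X))   # cyclic: same fixed order each epoch
    # rng.shuffle(order)        # almost cyclic: reshuffle once per epoch
    for i in order:
        x, t = X[i:i + 1], y[i:i + 1]
        # Forward pass.
        h = sigmoid(x @ W1)
        o = sigmoid(h @ W2)
        # Backward pass for the squared error 0.5 * (o - t)^2.
        do = (o - t) * o * (1 - o)
        dh = (do @ W2.T) * h * (1 - h)
        # Gradient step; the lam * W term is the weight-decay penalty,
        # which also keeps the weights bounded during training.
        W2 -= eta * (h.T @ do + lam * W2)
        W1 -= eta * (x.T @ dh + lam * W1)

Uncommenting the shuffle line switches the sketch from the cyclic to the almost cyclic order; the penalty term itself is identical in both variants.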

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors