Article ID: 403773
Journal: Neural Networks
Published Year: 2016
Pages: 13
File Type: PDF
Abstract

Weight elimination offers a simple and efficient improvement to the training algorithms of feedforward neural networks. It is a general regularization technique whose behavior is governed by a flexible scaling parameter; in fact, for a large scaling parameter the weight elimination penalty reduces to weight decay regularization. Many applications of this technique and of its refinements have been reported, yet little research has concentrated on its convergence behavior. In this paper, we theoretically analyze weight elimination for the cyclic learning method and establish conditions for the uniform boundedness of the weight sequence and for weak and strong convergence. Based on the assumed network parameters, the optimal choice of the scaling parameter can also be determined. Moreover, two illustrative simulations are presented to support the theoretical results.
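As a concrete illustration of the penalty discussed above, the following Python sketch implements the commonly used weight-elimination regularizer and its gradient. The variable names (w0 for the scaling parameter, lam for the regularization coefficient) and the default values are illustrative assumptions rather than notation taken from this paper; the comments note how the penalty approximates weight decay when the scaling parameter is large.

import numpy as np

def weight_elimination_penalty(w, w0=1.0, lam=1e-3):
    # Penalty: lam * sum_i (w_i/w0)^2 / (1 + (w_i/w0)^2)
    r = (w / w0) ** 2
    return lam * np.sum(r / (1.0 + r))

def weight_elimination_grad(w, w0=1.0, lam=1e-3):
    # Gradient of the penalty with respect to w; in practice it is added
    # to the error gradient at each (cyclic) weight update.
    r = (w / w0) ** 2
    return lam * (2.0 * w / w0 ** 2) / (1.0 + r) ** 2

# When w0 is large relative to the weights, r is small and the penalty
# behaves like lam * sum(w_i^2) / w0^2, i.e. ordinary weight decay.
# When w0 is small, the penalty saturates for large weights and mainly
# drives weights of magnitude near or below w0 toward zero.
w = np.array([0.05, -0.02, 1.5])
print(weight_elimination_penalty(w, w0=10.0))  # near the weight-decay regime
print(weight_elimination_penalty(w, w0=0.1))   # elimination regime for small weights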

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence