| Article ID | Journal | Published Year | Pages | File Type |
| --- | --- | --- | --- | --- |
| 4946694 | Neural Networks | 2017 | 31 Pages | |
Abstract
Fractional calculus has been found to be a promising area of research for information processing and the modeling of some physical systems. In this paper, we propose a fractional gradient descent method for the backpropagation (BP) training of neural networks. In particular, the Caputo derivative is employed to evaluate the fractional-order gradient of the error, defined as the traditional quadratic energy function. The monotonicity and weak (strong) convergence of the proposed approach are proved in detail. Two simulations have been implemented to illustrate the performance of the presented fractional-order BP algorithm on three small datasets and one large dataset. The numerical simulations also effectively verify the theoretical observations of this paper.
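The abstract's idea of replacing the ordinary gradient with a Caputo fractional-order gradient can be illustrated with a minimal one-dimensional sketch. This is an assumption-laden toy, not the paper's algorithm: it uses the common first-term Caputo approximation D^α f(w) ≈ f′(w)·|w − c|^(1−α)/Γ(2−α), with the previous iterate as the base point c, applied to a quadratic loss E(w) = ½(w − target)². The function names, the learning rate, and the choice of base point are all illustrative.

```python
import math

def caputo_frac_grad(grad, w, w_base, alpha):
    """First-term Caputo approximation of the order-alpha derivative.

    Assumes D^alpha f(w) ~ f'(w) * |w - w_base|**(1 - alpha) / Gamma(2 - alpha),
    which reduces to the ordinary gradient when alpha = 1.
    """
    return grad * abs(w - w_base) ** (1.0 - alpha) / math.gamma(2.0 - alpha)

def fractional_gd(alpha=0.9, lr=0.1, steps=200, target=2.0):
    """Toy fractional gradient descent on E(w) = 0.5 * (w - target)**2."""
    w, w_prev = 5.0, 4.0  # current weight and base point (previous iterate)
    for _ in range(steps):
        grad = w - target                              # dE/dw for the quadratic loss
        fg = caputo_frac_grad(grad, w, w_prev, alpha)  # fractional-order gradient
        w_prev, w = w, w - lr * fg                     # descend along the fractional gradient
    return w
```

With `alpha = 1` the update collapses to standard gradient descent; for `0 < alpha < 1` the factor |w − c|^(1−α) rescales the step size, which is the mechanism whose monotonicity and convergence the paper analyzes for BP training.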
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Jian Wang, Yanqing Wen, Yida Gou, Zhenyun Ye, Hua Chen