Article ID | Journal | Published Year | Pages
---|---|---|---
409914 | Neurocomputing | 2012 | 4 Pages
Abstract
The momentum method is commonly used to accelerate the training of neural networks. In this paper, a new adaptive momentum algorithm is proposed for training split-complex recurrent neural networks. Unlike other momentum methods, this algorithm uses a variable gain factor and a variable learning rate to speed up convergence and smooth the weight trace. The global convergence of the new algorithm is proved under mild conditions. Numerical results show that the algorithm is efficient for the given test problems.
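The paper's exact gain and learning-rate rules are not reproduced in this abstract, but the general idea of momentum with a variable gain factor can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the adaptation rule (shrinking the momentum gain when successive gradients point in opposing directions), the function `adaptive_momentum_step`, and the constants `eta` and `mu` are all assumptions chosen for the example.

```python
import numpy as np

def adaptive_momentum_step(w, grad, velocity, prev_grad, eta=0.1, mu=0.5):
    """One weight update with a variable momentum gain (illustrative sketch).

    Assumed rule (not from the paper): the momentum gain is scaled by the
    alignment of successive gradients, so the gain drops toward zero when
    the gradient direction reverses, which smooths the weight trace.
    """
    denom = np.linalg.norm(grad) * np.linalg.norm(prev_grad)
    align = float(grad @ prev_grad) / denom if denom > 0 else 0.0
    gain = mu * max(align, 0.0)          # variable gain factor (assumed rule)
    velocity = gain * velocity - eta * grad
    return w + velocity, velocity

# Toy objective: f(w) = 0.5 * ||w||^2, whose gradient is simply w
w = np.array([1.0, -2.0])
v = np.zeros_like(w)
prev = w.copy()
for _ in range(100):
    g = w                                # gradient of the quadratic
    w, v = adaptive_momentum_step(w, g, v, prev)
    prev = g
```

On this toy quadratic the iterates contract toward zero; a real split-complex network would apply such an update separately to the real and imaginary parts of each complex weight.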
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Dongpo Xu, Hongmei Shao, Huisheng Zhang