| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 4643259 | Journal of Computational and Applied Mathematics | 2006 | 13 | |
Abstract
In this paper a globally convergent first-order training algorithm is proposed that uses sign-based information of the batch error measure within the framework of the nonlinear Jacobi process. This approach allows us to equip the recently proposed Jacobi–Rprop method with the global convergence property, i.e. convergence to a local minimizer from any initial starting point. We also propose a strategy that ensures the search direction of the globally convergent Jacobi–Rprop is a descent direction. The behaviour of the algorithm is empirically investigated on eight benchmark problems. Simulation results verify that the algorithm's convergence success rate is indeed improved.
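The abstract refers to sign-based training in the Rprop family, where each weight's step size adapts from the sign of successive partial derivatives rather than their magnitude. The paper's globally convergent Jacobi–Rprop method itself is not given here; as background, the following is a minimal sketch of a standard sign-based update (the well-known iRprop− variant, with the commonly used increase/decrease factors 1.2 and 0.5), applied to a one-dimensional toy objective. All names and constants below are illustrative, not the authors' algorithm.

```python
import math

def irprop_minus_step(grad, prev_grad, step,
                      eta_plus=1.2, eta_minus=0.5,
                      step_min=1e-6, step_max=50.0):
    """One sign-based step of the iRprop- variant (illustrative sketch).

    Returns (delta, new_step, stored_grad) for a single weight.
    """
    if grad * prev_grad > 0:
        # gradient sign unchanged: grow the step size
        step = min(step * eta_plus, step_max)
    elif grad * prev_grad < 0:
        # sign flip: we overshot a minimum, so shrink the step and
        # zero the stored gradient so the next comparison is neutral
        step = max(step * eta_minus, step_min)
        grad = 0.0
    # move opposite to the sign of the (possibly zeroed) derivative
    delta = -math.copysign(step, grad) if grad != 0.0 else 0.0
    return delta, step, grad

# Toy usage: minimise f(w) = (w - 1)^2, whose derivative is 2*(w - 1)
w, step, prev = 5.0, 0.1, 0.0
for _ in range(100):
    g = 2.0 * (w - 1.0)
    delta, step, prev = irprop_minus_step(g, prev, step)
    w += delta
```

Note that only the sign of the derivative enters the weight update; the step size carries all magnitude information, which is what makes batch error-measure sign information sufficient for this family of methods.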
Related Topics
Physical Sciences and Engineering
Mathematics
Applied Mathematics
Authors
Aristoklis D. Anastasiadis, George D. Magoulas, Michael N. Vrahatis
