Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
1890632 | Chaos, Solitons & Fractals | 2008 | 12 Pages | 
Abstract
The convergence of neural-network minor component analysis (MCA) learning algorithms is crucial for practical applications. In this paper, we analyze the global convergence of an adaptive minor component extraction algorithm via a corresponding deterministic discrete time (DDT) system. It is shown that if the learning rate satisfies certain conditions, almost all trajectories of the DDT system are bounded and converge to the minor component of the autocorrelation matrix of the input data. Simulations are carried out to illustrate the results.
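The abstract does not reproduce the update rule itself. As an illustration only, the sketch below simulates a generic DDT iteration for minor component extraction: plain gradient descent on the Rayleigh quotient of an autocorrelation matrix R, which is a standard MCA-type scheme and not the authors' specific adaptive algorithm. The learning rate eta, the data dimension, and the initialization of the weight vector w are assumptions chosen for the demo; the check at the end compares the converged direction with the true minor component of R.

```python
# Hedged sketch of a deterministic discrete-time (DDT) MCA iteration.
# This is NOT the paper's algorithm: it does gradient descent on the
# Rayleigh quotient r(w) = (w^T R w) / (w^T w), whose minimizer is the
# minor component (eigenvector of the smallest eigenvalue) of R.
import numpy as np

rng = np.random.default_rng(0)

# Autocorrelation matrix R of some zero-mean input data
# (assumption: 5-dimensional data with a random covariance structure).
X = rng.standard_normal((1000, 5)) @ rng.standard_normal((5, 5))
R = X.T @ X / X.shape[0]

eta = 0.01                    # learning rate; must be small enough for convergence
w = rng.standard_normal(5)    # random nonzero initial weight vector

for _ in range(20000):
    r = (w @ R @ w) / (w @ w)              # current Rayleigh quotient
    grad = 2.0 * (R @ w - r * w) / (w @ w) # gradient of r(w); orthogonal to w
    w = w - eta * grad                     # DDT update

# Compare the converged direction with the true minor component of R.
eigvals, eigvecs = np.linalg.eigh(R)
minor = eigvecs[:, 0]                      # eigenvector of the smallest eigenvalue
w_dir = w / np.linalg.norm(w)
print("cosine similarity with minor component:", abs(w_dir @ minor))
```

Because the Rayleigh-quotient gradient is orthogonal to w, the update mainly rotates the weight vector rather than rescaling it, which keeps the trajectory bounded for a small learning rate, mirroring the boundedness condition discussed in the abstract.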
Related Topics
Physical Sciences and Engineering
Physics and Astronomy
Statistical and Nonlinear Physics
Authors
Dezhong Peng, Zhang Yi