| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 386554 | Expert Systems with Applications | 2010 | 6 | |
In this paper, a class of Zhang neural networks (ZNNs) is developed and analyzed with respect to its convergence properties. Different from conventional gradient-based neural networks (GNNs), such a ZNN is designed by exploiting the time-derivative information of the time-varying coefficients. The general framework of such a ZNN, together with its variant forms, is presented and investigated. The resultant ZNN model activated by linear functions possesses global exponential convergence to the time-varying equilibrium point. By employing the proposed new smooth nonlinear odd monotonically increasing activation functions, superior convergence can be achieved. Computer-simulation examples substantiate the efficacy of such a ZNN model in solving time-varying generalized linear matrix equations.
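The sketch below illustrates the general ZNN design idea described in the abstract, assuming for concreteness the time-varying linear matrix equation A(t)X(t) = B(t), a linear activation function, and forward-Euler integration. The coefficient functions A(t) and B(t), the design gain gamma, and the step size are hypothetical choices for illustration only, not taken from the paper.

```python
import numpy as np

# Minimal ZNN sketch for a time-varying equation A(t) X(t) = B(t).
# A(t), B(t), gamma, and dt below are illustrative assumptions.

def A(t):
    # hypothetical time-varying coefficient matrix
    return np.array([[2.0 + np.sin(t), 0.5 * np.cos(t)],
                     [0.5 * np.cos(t), 2.0 - np.sin(t)]])

def B(t):
    # hypothetical time-varying right-hand side
    return np.array([[np.cos(t)], [np.sin(t)]])

def time_derivative(f, t, h=1e-6):
    # central-difference estimate of the time derivative of f(t)
    return (f(t + h) - f(t - h)) / (2.0 * h)

def znn_solve(T=10.0, dt=1e-3, gamma=10.0):
    """Integrate ZNN dynamics with a linear activation function.

    Design error:     E(t) = A(t) X(t) - B(t)
    Design formula:   dE/dt = -gamma * E(t)
    Implied dynamics: A Xdot = -Adot X + Bdot - gamma * (A X - B)
    """
    t = 0.0
    X = np.zeros((2, 1))                    # arbitrary initial state
    while t < T:
        At, Bt = A(t), B(t)
        E = At @ X - Bt                     # time-varying residual error
        rhs = -time_derivative(A, t) @ X + time_derivative(B, t) - gamma * E
        Xdot = np.linalg.solve(At, rhs)     # assumes A(t) is nonsingular
        X = X + dt * Xdot                   # forward-Euler step
        t += dt
    return X

if __name__ == "__main__":
    X = znn_solve()
    t_final = 10.0
    print("final residual norm:",
          np.linalg.norm(A(t_final) @ X - B(t_final)))
```

With the linear activation shown, the design error decays exponentially, which mirrors the global exponential convergence claimed in the abstract; replacing `-gamma * E` with a nonlinear odd monotonically increasing function of E is the kind of variant the paper reports as yielding superior convergence.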