Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
10326083 | Neural Networks | 2005 | 8 Pages |
Abstract
This paper studies the global output convergence of a class of recurrent neural networks with globally Lipschitz continuous, monotone nondecreasing activation functions and locally Lipschitz continuous time-varying inputs. We establish two sufficient conditions for global output convergence of this class of neural networks. The present results do not require symmetry of the connection weight matrix, and they extend existing results.
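The abstract does not reproduce the network model itself. Results of this kind are typically stated for systems of the form x'(t) = -D x(t) + W g(x(t)) + u(t) with output y(t) = g(x(t)); the following is a minimal numerical sketch assuming that standard form, where the matrices, activation function, and input below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Assumed model class (not from the paper's text):
#   x'(t) = -D x(t) + W g(x(t)) + u(t),   y(t) = g(x(t))
# with g globally Lipschitz and monotone nondecreasing, and u(t) a locally
# Lipschitz time-varying input that converges as t -> infinity.

def g(x):
    # tanh is globally Lipschitz (constant 1) and monotone nondecreasing
    return np.tanh(x)

def u(t):
    # locally Lipschitz time-varying input converging to a constant vector
    return np.array([0.5, -0.3]) + np.exp(-t) * np.array([1.0, 2.0])

D = np.diag([1.0, 1.0])        # positive decay rates
W = np.array([[0.2, -0.5],     # connection weights; note W is NOT symmetric
              [0.3,  0.1]])

x = np.zeros(2)                # initial state
dt, T = 1e-3, 20.0
for k in range(int(T / dt)):   # forward Euler integration
    t = k * dt
    x = x + dt * (-D @ x + W @ g(x) + u(t))

print("output y(T) =", g(x))   # the output settles toward a constant vector
```

Running the sketch from different initial states yields the same limiting output, which is the kind of global output convergence behavior the paper's sufficient conditions are meant to guarantee.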
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Sanqing Hu, Derong Liu