Article ID: 9653575
Journal: Neurocomputing
Published Year: 2005
Pages: 15
File Type: PDF
Abstract
The global exponential stability of a class of delayed recurrent neural networks with Lipschitz-continuous activation functions is further investigated. By constructing a new Lyapunov functional and applying an elementary inequality technique, a set of new, less restrictive and less conservative conditions is derived for the global exponential stability of the delayed neural network model with more general activation functions. The proposed results improve and generalize several previous reports in the literature. Several examples are given to illustrate the validity and advantages of the new criteria.
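The abstract does not state the model equations explicitly; as a rough sketch, the class of delayed recurrent neural networks with Lipschitz-continuous activations typically studied in such work can be written as follows (all symbols below are assumptions for illustration, not taken from the paper):

\[
\begin{aligned}
\dot{x}_i(t) &= -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t-\tau_j)\bigr) + I_i, \qquad i = 1,\dots,n, \\
\bigl|f_j(u) - f_j(v)\bigr| &\le L_j\,|u - v| \quad \text{for all } u, v \in \mathbb{R},
\end{aligned}
\]

where, under this assumed formulation, \(c_i > 0\) are self-feedback rates, \(a_{ij}\) and \(b_{ij}\) are the connection and delayed-connection weights, \(\tau_j \ge 0\) are transmission delays, \(I_i\) are external inputs, and \(L_j\) are the Lipschitz constants of the activations. Global exponential stability of an equilibrium \(x^*\) then means there exist \(M \ge 1\) and \(\varepsilon > 0\) such that \(\|x(t) - x^*\| \le M e^{-\varepsilon t} \sup_{s \in [-\tau, 0]} \|x(s) - x^*\|\) for all \(t \ge 0\).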
Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence