Article ID: 407610
Journal: Neurocomputing
Published Year: 2013
Pages: 5 Pages
File Type: PDF
Abstract

This paper analyzes the robustness of global exponential stability of recurrent neural networks subject to parameter uncertainty in the connection weight matrix. Given a globally exponentially stable recurrent neural network, the problem addressed herein is how much parameter uncertainty in the connection weight matrix the network can tolerate while remaining globally exponentially stable. We characterize upper bounds on the parameter uncertainty under which the recurrent neural networks sustain global exponential stability. A numerical example is provided to illustrate the theoretical result.
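To make the problem concrete, the following minimal sketch (not the paper's model or its bound) illustrates the idea on a toy Hopfield-type network dx/dt = -x + W*tanh(x) + u, where global exponential stability is assumed to hold whenever the spectral norm of W is below 1 (tanh is 1-Lipschitz). Under that assumption, 1 - ||W|| acts as a crude admissible-uncertainty margin: a perturbation dW with ||W + dW|| < 1 leaves the network exponentially stable. The matrix sizes, weights, and margin here are illustrative choices only.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
n = 4

# Nominal connection weight matrix, scaled so its spectral norm is 0.6 (< 1).
W = rng.standard_normal((n, n))
W *= 0.6 / np.linalg.norm(W, 2)
u = rng.standard_normal(n)  # constant external input

# Crude tolerable-uncertainty margin for this toy setting: 1 - ||W||.
margin = 1.0 - np.linalg.norm(W, 2)

# Random perturbation of the weights, scaled to lie well inside the margin.
dW = rng.standard_normal((n, n))
dW *= 0.5 * margin / np.linalg.norm(dW, 2)

def rnn(t, x, A):
    # Hopfield-type dynamics: dx/dt = -x + A*tanh(x) + u
    return -x + A @ np.tanh(x) + u

# Trajectories of the perturbed network started from different initial states
# should converge toward each other, consistent with exponential stability.
t_span = (0.0, 30.0)
t_eval = np.linspace(*t_span, 300)
x0a = 5.0 * rng.standard_normal(n)
x0b = -5.0 * rng.standard_normal(n)
sol_a = solve_ivp(rnn, t_span, x0a, t_eval=t_eval, args=(W + dW,))
sol_b = solve_ivp(rnn, t_span, x0b, t_eval=t_eval, args=(W + dW,))

gap = np.linalg.norm(sol_a.y - sol_b.y, axis=0)
print(f"nominal ||W|| = {np.linalg.norm(W, 2):.3f}, ||dW|| = {np.linalg.norm(dW, 2):.3f}")
print(f"trajectory gap: start {gap[0]:.3f}, end {gap[-1]:.2e}")
```

The paper's actual results characterize such upper bounds rigorously for the class of recurrent neural networks it studies; the norm condition used above is only a stand-in to show what "remaining globally exponentially stable under bounded weight perturbation" looks like numerically.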

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence