Article ID: 838470
Journal: Nonlinear Analysis: Real World Applications
Published Year: 2007
Pages: 11
File Type: PDF
Abstract

This paper discusses the global output convergence of a class of recurrent neural networks with distributed delays. The inputs of the neural networks are required to be time-varying, and the activation functions to be globally continuous and monotone nondecreasing. Using matrix definiteness and the properties of M-matrices, several sufficient conditions are established that guarantee the global output convergence of this class of neural networks. Neither symmetry of the connection weight matrices nor boundedness of the activation functions is required. The convergence results are useful for solving certain optimization problems and for the design of recurrent neural networks with distributed delays.
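The abstract mentions that the sufficient conditions rest on the properties of M-matrices. As a hypothetical illustration only (the function name and test matrices below are not from the paper), a nonsingular M-matrix can be checked numerically: all off-diagonal entries must be nonpositive (a Z-matrix) and every eigenvalue must have a positive real part.

```python
import numpy as np

def is_nonsingular_m_matrix(A, tol=1e-10):
    """Check whether a square matrix A is a nonsingular M-matrix:
    nonpositive off-diagonal entries (Z-matrix) and all eigenvalues
    with positive real parts."""
    A = np.asarray(A, dtype=float)
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):
        return False  # has a positive off-diagonal entry: not a Z-matrix
    return bool(np.all(np.linalg.eigvals(A).real > tol))

# A strictly diagonally dominant Z-matrix with positive diagonal
# is a nonsingular M-matrix:
A = np.array([[ 3.0, -1.0],
              [-1.0,  2.0]])
print(is_nonsingular_m_matrix(A))   # True
print(is_nonsingular_m_matrix(-A))  # False (positive off-diagonals)
```

Conditions of this type are attractive in practice because they can be verified directly from the network's weight matrices, without simulating the delayed dynamics.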

Related Topics
Physical Sciences and Engineering › Engineering › Engineering (General)