Article ID: 1710575
Journal: Applied Mathematics Letters
Published Year: 2006
Pages: 6 Pages
File Type: PDF
Abstract

In this work we consider a general class of continuous activation functions that may be neither bounded nor differentiable, yet includes many sigmoidal functions as special cases. For this class of activation functions we establish an asymptotic stability result for neural networks under a weak nonnegative-definiteness condition. We then show that exponential stability is obtained under the additional condition of differentiability.

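As a rough illustration of the setting described in the abstract (the paper's own model and notation are not reproduced here), the sketch below gives a standard additive neural-network system together with two example activations, one sigmoidal and nondifferentiable, the other differentiable but unbounded; all symbols (x_i, c_i, a_{ij}, g_j, I_i) are assumed names for this sketch only.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Illustrative sketch only: a standard additive (Hopfield-type) network model
% with continuous activations g_j that need not be bounded or differentiable.
% All symbols are assumed names and are not taken from the paper itself.
\begin{equation*}
  \dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t)\bigr) + I_i,
  \qquad i = 1,\dots,n,
\end{equation*}
where each $g_j$ is continuous. For example, the piecewise-linear saturation
\begin{equation*}
  g_j(s) = \tfrac{1}{2}\bigl(\lvert s+1\rvert - \lvert s-1\rvert\bigr)
\end{equation*}
is sigmoidal and bounded but not differentiable at $s = \pm 1$, while
$g_j(s) = s$ is differentiable but unbounded; both are of the kind of
continuous activation the abstract refers to.
\end{document}
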
Related Topics
Physical Sciences and Engineering > Engineering > Computational Mechanics
Authors