Article code: 1897625
Journal code: 1044556
Publication year: 2006
Length: 12 pages (PDF full text)
English Article Title (ISI)
Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations
Related subjects
Engineering and Basic Sciences > Mathematics > Applied Mathematics
English Abstract

The paper considers a class of additive neural networks where the neuron activations are modeled by discontinuous functions or by continuous non-Lipschitz functions. Some tools are developed which enable us to apply a Lyapunov-like approach to differential equations with discontinuous right-hand side modeling the neural network dynamics. The tools include a chain rule for computing the time derivative along the neural network solutions of a nondifferentiable Lyapunov function, and a comparison principle for this time derivative, which yields conditions for exponential convergence or convergence in finite time. By means of the Lyapunov-like approach, a general result is proved on global exponential convergence toward a unique equilibrium point of the neural network solutions. Moreover, new results on global convergence in finite time are established, which are applicable to neuron activations with jump discontinuities, or neuron activations modeled by means of continuous (non-Lipschitz) Hölder functions.
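A minimal sketch of the setting, offered as an assumption based on the standard additive (Hopfield-type) model rather than the paper's own notation: the network dynamics are

    \dot{x}(t) = -D\,x(t) + T\,g(x(t)) + I, \qquad x(t) \in \mathbb{R}^n,

where D is a diagonal matrix of positive neuron self-inhibition rates, T is the interconnection matrix, I is a constant input vector, and g collects the neuron activations. When g has jump discontinuities the right-hand side is discontinuous, and solutions are understood in the Filippov sense, i.e. as solutions of the differential inclusion

    \dot{x}(t) \in -D\,x(t) + T\,\overline{\mathrm{co}}[g(x(t))] + I,

where \overline{\mathrm{co}}[g(\cdot)] denotes the closed convex hull of the activation values. In this framework, a finite-time convergence result of the kind mentioned in the abstract typically follows from a comparison inequality of the form \dot{V}(x(t)) \le -c\,V(x(t))^{\alpha} along solutions, with c > 0 and 0 \le \alpha < 1, which forces the Lyapunov function V to reach zero no later than t = V(x(0))^{1-\alpha} / (c(1-\alpha)).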

Publisher
Database: Elsevier - ScienceDirect
Journal: Physica D: Nonlinear Phenomena - Volume 214, Issue 1, 1 February 2006, Pages 88–99