Article ID: 406426 | Journal ID: 678084 | Year: 2015 | English full text: 11 pages (PDF)
English title (ISI article)
A discrete gradient method to enhance the numerical behaviour of Hopfield networks
Persian translation of the title
یک روش شیب گسسته برای افزایش رفتار عددی شبکه هاپفیلد
Related subjects
Engineering and Basic Sciences > Computer Engineering > Artificial Intelligence
Abstract (English)

This paper presents the construction of a numerical method for implementing algorithms that are based on a gradient flow. In particular, continuous Hopfield networks for solving optimization problems are considered as a case in point. The focus is the preservation of the favourable properties of the continuous system under discretization. First, the conventional discretization is formulated as a non-standard numerical method whose behaviour in solving the continuous equation depends on the step size. A rigorous theoretical analysis shows that it is a consistent method, but it fails to preserve the gradient nature: periodic solutions occur, so no Lyapunov function can exist. Next, we present the construction of a method that preserves the Lyapunov function of the original differential equation regardless of the step size. This procedure, based upon discrete gradients, yields an algorithm that preserves the Lyapunov function of the continuous system, thus reproducing the same qualitative behaviour, namely stability of equilibria and convergence to solutions. A remarkable property of the proposed technique is that it can be computed explicitly as long as the Lyapunov function is multi-linear, which is the case of Hopfield neural networks. This results in enhanced performance, compared to the conventional discretization, as shown in a comprehensive set of numerical experiments.
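The key idea in the abstract can be illustrated on a toy problem. The following sketch (not the paper's implementation; the network, matrix, and step size are illustrative assumptions) applies a discrete-gradient update to the gradient flow dx/dt = -grad V(x) with a quadratic Lyapunov function V(x) = ½ xᵀA x. For quadratic V, the midpoint evaluation A(x + y)/2 satisfies the defining discrete-gradient identity dg(x, y) · (y - x) = V(y) - V(x) exactly, so the update (y - x)/h = -dg(x, y) forces V(y) - V(x) = -h‖dg(x, y)‖² ≤ 0 for every step size h, which is the step-size-independent Lyapunov preservation the abstract describes; here the update reduces to a linear solve.

```python
import numpy as np

# Discrete-gradient integration of the gradient flow dx/dt = -A x,
# which is the gradient flow of V(x) = 0.5 * x^T A x (A symmetric
# positive definite). For this quadratic V, the midpoint discrete
# gradient dg(x, y) = A (x + y) / 2 satisfies
#     dg(x, y) . (y - x) = V(y) - V(x)   exactly,
# so the update (y - x)/h = -dg(x, y) decreases V at EVERY step,
# no matter how large h is. The update is the linear system
#     (I + (h/2) A) y = (I - (h/2) A) x.

def V(A, x):
    """Quadratic Lyapunov function V(x) = 0.5 x^T A x."""
    return 0.5 * x @ A @ x

def discrete_gradient_step(A, x, h):
    """One discrete-gradient step for dx/dt = -A x."""
    n = A.shape[0]
    I = np.eye(n)
    return np.linalg.solve(I + 0.5 * h * A, (I - 0.5 * h * A) @ x)

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4.0 * np.eye(4)        # symmetric positive definite
x = rng.standard_normal(4)

h = 5.0                              # deliberately large step size
values = [V(A, x)]
for _ in range(20):
    x = discrete_gradient_step(A, x, h)
    values.append(V(A, x))

# V decreases monotonically despite the large step size.
assert all(b <= a + 1e-12 for a, b in zip(values, values[1:]))
```

A conventional explicit Euler step x ← x - h·A x with the same h = 5 would diverge here; the discrete-gradient scheme instead inherits the stability of the continuous flow, mirroring the comparison the paper makes for Hopfield networks.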

Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 164, 21 September 2015, Pages 45–55