Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
412797 | Neurocomputing | 2010 | 7 Pages |
Abstract
Hopfield Neural Networks are well suited to the fast solution of complex optimization problems. Their application to real problems usually requires satisfying a set of linear constraints, which can be incorporated through an additional violation term. Another option proposed in the literature is to confine the search to the subspace defined by the constraints, so that the neuron outputs always satisfy the imposed restrictions. This paper proposes a computationally efficient subspace projection method that also includes mechanisms for varying the update step. Numerical experiments verify the good performance and fast convergence of the new method.
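To illustrate the idea of confining the search to the constraint subspace, the sketch below projects gradient-style updates of a quadratic Hopfield energy onto the null space of a linear constraint set `A v = b`, so every iterate remains feasible. This is a minimal, hypothetical illustration with an assumed energy form and fixed step size; it is not the paper's method, which additionally uses an efficient projection and variable step mechanisms.

```python
# Hedged sketch: keep Hopfield-style updates inside the affine subspace A v = b
# by projecting each step onto the null space of A. The energy E(v) and the
# constant step size eta are illustrative assumptions, not the paper's scheme.
import numpy as np

def nullspace_projector(A):
    """Orthogonal projector onto the null space of A (assumes full row rank)."""
    return np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)

def projected_descent(W, theta, A, b, steps=500, eta=0.05):
    """Minimize E(v) = -0.5 v^T W v + theta^T v subject to A v = b."""
    # Start from a feasible point: the minimum-norm solution of A v = b.
    v = A.T @ np.linalg.solve(A @ A.T, b)
    P = nullspace_projector(A)
    for _ in range(steps):
        grad = -W @ v + theta        # gradient of the assumed energy
        v = v - eta * (P @ grad)     # projected step preserves A v = b
    return v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 6, 2
    M = rng.standard_normal((n, n))
    W = -(M @ M.T)                   # negative definite -> convex energy
    theta = rng.standard_normal(n)
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    v = projected_descent(W, theta, A, b)
    print("constraint residual:", np.linalg.norm(A @ v - b))
```

Because the initial point is feasible and each update direction lies in the null space of `A`, the constraint residual stays at numerical zero throughout, which is the behaviour the subspace-confinement approach aims for.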
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Daniel Calabuig, Sonia Gimenez, Jose E. Roman, Jose F. Monserrat