Article ID: 406323
Journal: Neurocomputing
Published Year: 2015
Pages: 9 Pages
File Type: PDF
Abstract

The error back-propagation (BP) algorithm based neural networks (i.e., BP-type neural networks, BPNNs) and Hopfield-type neural networks (HNNs) are two famous classes of neural networks that have been proposed, developed, and investigated extensively for scientific research and engineering applications. They differ greatly from each other in terms of network architecture, physical meaning, and training pattern. In this literature-review paper, we present, in a relatively complete and creative manner, the common learning nature shared by BP-type and Hopfield-type neural networks in solving various (mathematical) problems. Specifically, comparing the BPNN with the HNN on the same problem-solving task, e.g., matrix inversion as well as function approximation, we show that the BPNN weight-updating formula and the HNN state-transition equation turn out to be essentially the same. This interesting phenomenon promises that, given a neural-network model for a specific problem-solving task, a dual neural-network model can potentially be developed.
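To make the claimed duality concrete, the following is a minimal sketch (not taken from the paper) for the matrix-inversion task the abstract mentions. Assuming the energy function E(X) = ||AX - I||_F^2 / 2, gradient descent yields the update X <- X - eta * A^T(AX - I). This single iteration can be read either as a BPNN weight-updating formula (with X as the weight matrix) or as a discretized HNN state-transition equation (with X as the network state); the function name, step size, and stopping rule here are illustrative assumptions.

```python
import numpy as np

def invert_by_gradient_descent(A, eta=0.05, steps=2000):
    """Approximate A^{-1} by gradient descent on E(X) = ||AX - I||_F^2 / 2.

    The update X <- X - eta * A^T (A X - I) is the shared formula: a BPNN
    weight update if X is viewed as weights, or a discretized HNN state
    transition if X is viewed as the network state.
    """
    n = A.shape[0]
    X = np.zeros_like(A, dtype=float)  # initial weights/state
    I = np.eye(n)
    for _ in range(steps):
        X -= eta * A.T @ (A @ X - I)   # shared update/transition rule
    return X

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = invert_by_gradient_descent(A)
# X approximates np.linalg.inv(A) when eta < 2 / lambda_max(A^T A)
```

Note the step-size condition in the final comment: for a fixed learning rate, convergence requires eta below 2 divided by the largest eigenvalue of A^T A, which mirrors the usual stability condition when an HNN's continuous dynamics are discretized.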

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence