Article ID: 408799
Journal: Neurocomputing
Published Year: 2009
Pages: 4
File Type: PDF
Abstract

Gradient descent algorithms such as backpropagation (BP) and its variants on multilayered feed-forward networks are widely used in many applications, including the solution of differential equations. According to regularization theory, reformulated radial basis function networks (RBFNs) are expected to generalize more accurately than BP networks. We show how to apply both networks to a specific example of a differential equation and compare their generalization capability and convergence. An experimental comparison of the various approaches shows that the reformulated RBFN outperforms BP on this example.
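The abstract does not describe the experimental setup, but the general technique it refers to can be illustrated with a small sketch: pick a trial solution that satisfies the boundary condition by construction, expand the free part in a radial basis function network, and train the weights by gradient descent so that the differential equation's residual vanishes at collocation points. The example ODE (y' = -y, y(0) = 1), the trial form, and all hyperparameters below are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch: solve y'(x) = -y(x), y(0) = 1 on [0, 1] (exact: exp(-x))
# with a Gaussian RBF network trained by gradient descent. Assumed setup,
# not the authors' implementation.
import numpy as np

x = np.linspace(0.0, 1.0, 50)          # collocation points
centers = np.linspace(0.0, 1.0, 10)    # RBF centers
sigma = 0.2                            # shared RBF width
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=centers.size)  # output weights

# Gaussian basis activations and their x-derivatives, shape (50, 10).
phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * sigma**2))
dphi = phi * (-(x[:, None] - centers[None, :]) / sigma**2)

# Trial solution psi(x) = 1 + x * N(x), with N(x) = phi @ w, satisfies
# y(0) = 1 by construction, so only the ODE residual must be driven to zero:
#   r(x) = psi'(x) + psi(x) = A @ w + 1, which is affine in w for this form.
A = phi + x[:, None] * (dphi + phi)

# Gradient descent on the mean squared residual; the step size is set from
# the largest curvature of the quadratic loss so the iteration stays stable.
L = np.linalg.eigvalsh(A.T @ A / x.size).max()
lr = 1.0 / L
for _ in range(5000):
    r = A @ w + 1.0
    w -= lr * (A.T @ r / x.size)

psi = 1.0 + x * (phi @ w)
print(f"max |psi(x) - exp(-x)| = {np.abs(psi - np.exp(-x)).max():.2e}")
```

The same residual-minimization loop applies to a feed-forward BP network; only the parametrization of N(x) changes, which is what makes the two approaches directly comparable on a given equation.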
