Article ID: 409060
Journal: Neurocomputing
Published Year: 2008
Pages: 11
File Type: PDF
Abstract

A new class of search-based training algorithms for feedforward networks is introduced. These algorithms do not calculate analytical gradients and do not use stochastic or genetic search techniques. Instead, the forward step is used to calculate the error in response to localized weight changes explored with systematic search techniques. One of the simplest variants of this type of algorithm, the variable step search (VSS) algorithm, is studied in detail. The VSS procedure changes one network parameter at a time and therefore imposes no restrictions on the network structure or the type of transfer functions. A rough approximation of the gradient direction and the determination of the optimal step along this direction toward the minimum of the cost function are performed simultaneously. Modifying the value of a single weight changes the signals only in a small fragment of the network, allowing contributions to the error to be calculated efficiently. Several heuristics that increase the efficiency of the VSS algorithm are discussed. Tests on benchmark data show that VSS performs no worse than, and sometimes significantly better than, such renowned algorithms as Levenberg-Marquardt or scaled conjugate gradient.
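To make the single-parameter search procedure concrete, the following is a minimal sketch of a VSS-style coordinate search in Python/NumPy. The network shape, step schedule, and the names (forward, mse, search_one, vss_epoch) are illustrative assumptions, not taken from the paper; for simplicity the sketch recomputes the full forward pass after each weight change, whereas the paper exploits the locality noted above to recompute only the affected fragment of the network.

    import numpy as np

    def forward(weights, X):
        # One hidden tanh layer, linear output; weights = [W1, b1, W2, b2].
        W1, b1, W2, b2 = weights
        return np.tanh(X @ W1 + b1) @ W2 + b2

    def mse(weights, X, y):
        return float(np.mean((forward(weights, X) - y) ** 2))

    def search_one(flat, i, cost, best, step, grow=2.0, shrink=0.5, tol=1e-6):
        # Variable-step line search along coordinate i: expand the step
        # while moves keep lowering the cost, shrink it when both signs fail.
        s = step
        for _ in range(50):                   # safety cap on trials
            if s <= tol:
                break
            improved = False
            for sign in (1.0, -1.0):
                flat[i] += sign * s
                trial = cost()                # forward pass only, no gradients
                if trial < best:
                    best, improved = trial, True
                    s *= grow                 # success: try a bolder step
                    break
                flat[i] -= sign * s           # failure: undo the move
            if not improved:
                s *= shrink                   # refine the step size
        return best

    def vss_epoch(weights, X, y, step=0.2):
        # One pass over all parameters, changing one at a time.
        cost = lambda: mse(weights, X, y)
        best = cost()
        for W in weights:
            flat = W.ravel()                  # view: edits write through to W
            for i in range(flat.size):
                best = search_one(flat, i, cost, best, step)
        return best

    # Hypothetical usage on a toy regression problem (2-6-1 network):
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, (64, 2))
    y = X[:, :1] * X[:, 1:]                   # target: product of the inputs
    weights = [rng.normal(0, 0.5, (2, 6)), np.zeros((1, 6)),
               rng.normal(0, 0.5, (6, 1)), np.zeros((1, 1))]
    for epoch in range(20):
        print(epoch, vss_epoch(weights, X, y))

Because each candidate move is evaluated by a forward pass alone, the sketch works unchanged for non-differentiable transfer functions, which is one reason the coordinate-wise search imposes no restrictions on the network.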

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors