Article ID: 10326526
Journal: Neural Networks
Published Year: 2008
Pages: 9
File Type: PDF
Abstract
The learning process of a multilayer perceptron requires the optimization of an error function E(y, t) that compares the predicted output, y, with the observed target, t. We review several common error functions, analyze their mathematical properties for data classification, and introduce a new one, EExp, inspired by the Z-EDM algorithm that we recently proposed. An important property of EExp is its ability to emulate the behavior of other error functions through the adjustment of a single real-valued parameter; in other words, EExp is a generalized error function embodying complementary features of the others. Experimental results show that the flexibility of this generalized error function lets it match the best results achievable with the other functions, with a performance improvement in some cases.
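As a rough illustration of how a single parameter can make one error function interpolate toward another, the sketch below uses an exponential-of-squared-error form, E = tau * sum_i exp(||y_i - t_i||^2 / tau). This exact formula, the function names, and the toy data are assumptions for illustration, not the paper's definition of EExp; the point is only that for large tau the Taylor expansion tau * exp(x / tau) ≈ tau + x makes the function behave like a shifted sum-of-squared-errors, while small tau weights large per-pattern errors more heavily.

```python
import math

def e_exp(y, t, tau):
    """Illustrative exponential error (assumed form, not the paper's exact EExp):
    E = tau * sum_i exp(||y_i - t_i||^2 / tau) over patterns i."""
    return tau * sum(
        math.exp(sum((yk - tk) ** 2 for yk, tk in zip(yi, ti)) / tau)
        for yi, ti in zip(y, t)
    )

def sse(y, t):
    """Plain sum of squared errors, for comparison."""
    return sum((yk - tk) ** 2 for yi, ti in zip(y, t) for yk, tk in zip(yi, ti))

# Toy predicted outputs and 1-of-c targets for a 3-pattern, 2-class problem.
y = [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]]
t = [[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]]
n = len(y)

# Large tau: tau * exp(x / tau) ~= tau + x, so E - n*tau approaches the SSE.
tau = 1e6
print(abs((e_exp(y, t, tau) - n * tau) - sse(y, t)) < 1e-3)
```

With a small tau the same function changes character: the exponential amplifies the contribution of badly classified patterns, which is one way a single real-valued parameter can move an error function between qualitatively different behaviors, as the abstract describes for EExp.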
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors