Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6866787 | Neurocomputing | 2014 | 10 Pages |
Abstract
This paper proposes two new training algorithms for multilayer perceptrons based on evolutionary computation, regularization, and transduction. Regularization is a commonly used technique for preventing a learning algorithm from overfitting the training data. In this context, this work introduces and analyzes a novel regularization scheme for neural networks (NNs), named eigenvalue decay, which aims at improving the classification margin. The introduction of eigenvalue decay led to the development of a new training method based on the same principles as the SVM, hence named Support Vector NN (SVNN). Finally, by analogy with the transductive SVM (TSVM), a transductive NN (TNN) is proposed, exploiting the SVNN to address transductive learning. The effectiveness of the proposed algorithms is evaluated on seven benchmark datasets.
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Oswaldo Ludwig, Urbano Nunes, Rui Araujo