Article ID: 406572
Journal: Neurocomputing
Published Year: 2014
Pages: 12
File Type: PDF
Abstract

Classification is a mainstream research topic within the machine learning community, and a large number of learning algorithms have been proposed for it. The performance of many of these algorithms can depend strongly on the chosen values of their hyper-parameters. This paper introduces a novel method for addressing the model selection problem for a given classification task, in a formulation that considers both the learning algorithm and its hyper-parameters. Model selection is tackled as a multi-objective optimization problem whose objectives are the empirical (training) error and the model complexity. A multi-objective evolutionary algorithm is adopted as the search engine, owing to its high performance and its advantages for solving multi-objective problems. The model complexity is estimated experimentally, in a fashion applicable to any learning algorithm, through the VC dimension. Strategies are also proposed for choosing a single model, or for constructing an ensemble of models, from the resulting non-dominated set. Experimental results on benchmark data sets indicate the effectiveness of the proposed approach. Furthermore, a comparative study shows that the obtained models are highly competitive, in terms of generalization performance, with state-of-the-art methods that focus on a single learning algorithm or a single-objective formulation.
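The core idea in the abstract can be sketched with a small, self-contained example: candidate models are scored on two objectives (empirical error and a complexity estimate), the non-dominated (Pareto) set is extracted, and a single model is then chosen from it. All model labels, error values, and complexity numbers below are illustrative placeholders, not results from the paper; in the paper's formulation the complexity would come from an experimental VC-dimension estimate, and the search would be driven by a multi-objective evolutionary algorithm rather than exhaustive enumeration.

```python
# Hypothetical candidates: (label, empirical error, complexity estimate).
# Both objectives are to be minimized; the numbers are made up for illustration.
candidates = [
    ("svm-lin, C=1",   0.12, 3.0),
    ("svm-rbf, C=100", 0.06, 12.0),
    ("tree, depth=4",  0.10, 4.0),
    ("tree, depth=12", 0.04, 11.0),
    ("knn, k=15",      0.15, 2.0),
    ("knn, k=1",       0.02, 14.0),
]

def dominates(a, b):
    """a dominates b if it is no worse on both objectives and better on at least one."""
    return (a[1] <= b[1] and a[2] <= b[2]) and (a[1] < b[1] or a[2] < b[2])

def non_dominated(models):
    """Return the Pareto set: models not dominated by any other candidate."""
    return [m for m in models
            if not any(dominates(o, m) for o in models if o is not m)]

pareto = non_dominated(candidates)

# One simple single-model strategy (a stand-in for a knee-point criterion):
# pick the non-dominated model minimizing the sum of normalized objectives.
err_max = max(m[1] for m in pareto)
cpx_max = max(m[2] for m in pareto)
best = min(pareto, key=lambda m: m[1] / err_max + m[2] / cpx_max)
```

Here `svm-rbf, C=100` is dominated (a deeper tree has both lower error and lower complexity), so it is excluded from the Pareto set; the remaining models trade error against complexity, and the normalized-sum rule favors a balanced point on the front. The ensemble strategy mentioned in the abstract would instead combine several members of `pareto` rather than selecting one.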

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence