Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
10362202 | Pattern Recognition Letters | 2005 | 9 |
Abstract
Feature selection is a fundamental process in many classifier design problems. However, it is NP-complete, and approximate approaches often require extensive exploration and evaluation. This paper describes a novel approach that represents feature selection as a continuous regularization problem with a single, global minimum, where the model's complexity is measured using a 1-norm on the parameter vector. A new exploratory design process is also described that allows the designer to efficiently construct the complete locus of sparse, kernel-based classifiers. It allows the designer to investigate the optimal parameters' trajectories as the regularization parameter is altered and to look for effects, such as Simpson's paradox, that occur in many multivariate data analysis problems. The approach is demonstrated on the well-known Australian Credit data set.
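The abstract's core idea, 1-norm regularization driving parameters to exactly zero as the regularization parameter grows, can be illustrated with a minimal sketch. Assuming an orthonormal design (a simplification not stated in the paper), the 1-norm solution has a closed form via soft-thresholding, so each weight's trajectory along the regularization path can be traced exactly; the weight values below are hypothetical.

```python
# Sketch under an assumed orthonormal design: each 1-norm-regularized
# weight is the soft-threshold of its unregularized least-squares value,
# so the full regularization path can be computed coordinate-wise.
def soft_threshold(z, lam):
    """Closed-form 1-norm solution for one coordinate: shrink toward zero
    by lam, clipping to exactly zero inside [-lam, lam]."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

# Hypothetical unregularized weights for three features.
z = [2.0, -0.75, 0.25]

# Trace each weight's trajectory as the regularization parameter grows;
# weaker features hit zero first and are thereby deselected.
path = {lam: [soft_threshold(zj, lam) for zj in z]
        for lam in (0.0, 0.5, 1.0, 2.5)}

for lam, weights in sorted(path.items()):
    print(lam, weights)
```

At lam = 0 all three features survive; by lam = 0.5 the weakest is zeroed out, and by lam = 2.5 every weight is zero, mirroring how the paper's design process sweeps the regularization parameter to expose the complete locus of sparse classifiers.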
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Vision and Pattern Recognition
Authors
M. Brown, N.P. Costen