Article ID: 384156
Journal: Expert Systems with Applications
Published Year: 2012
Pages: 11
File Type: PDF
Abstract

The process of placing a separating hyperplane for data classification is normally disconnected from the process of selecting the features to use. An approach to feature selection that is conceptually simple but computationally explosive is simply to apply the hyperplane placement process to every possible subset of features and select the smallest subset that provides reasonable classification accuracy. Two ways to speed this process are (i) using a faster filtering criterion in place of a complete hyperplane placement, and (ii) using a greedy forward or backward sequential selection method. This paper introduces a new and very fast filtering criterion: maximizing the drop in the sum of infeasibilities in a linear-programming transformation of the problem. It also shows how the linear-programming transformation can be applied to reduce the number of features after a separating hyperplane has already been placed, while maintaining the separation originally induced by that hyperplane. Finally, it introduces a new and highly effective integrated method that selects features while simultaneously placing the separating hyperplane.
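The abstract does not spell out the linear-programming transformation, but a standard sum-of-infeasibilities formulation minimizes the total slack needed to satisfy the margin constraints y_i(w·x_i + b) ≥ 1. The sketch below is a minimal illustration of the filtering criterion under that assumption; the helper names sum_of_infeasibilities and greedy_forward_select are hypothetical, and scipy's linprog stands in for whatever LP solver the authors used.

```python
import numpy as np
from scipy.optimize import linprog

def sum_of_infeasibilities(X, y, features):
    """Minimum total slack needed to satisfy
    y_i * (w . x_i + b) >= 1 - s_i, s_i >= 0,
    using only the given feature columns (hypothetical helper)."""
    Xf = X[:, features]
    n, d = Xf.shape
    # Decision variables: [w (d entries), b, s (n entries)].
    c = np.concatenate([np.zeros(d + 1), np.ones(n)])   # minimize sum of slacks
    # Margin constraint rewritten in linprog's A_ub @ x <= b_ub form:
    #   -y_i * (w . x_i) - y_i * b - s_i <= -1
    A_ub = np.hstack([-y[:, None] * Xf, -y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(None, None)] * (d + 1) + [(0, None)] * n  # w, b free; s >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

def greedy_forward_select(X, y, k):
    """Forward selection: repeatedly add the feature whose inclusion
    gives the largest drop in the sum of infeasibilities."""
    selected, remaining = [], list(range(X.shape[1]))
    score = sum_of_infeasibilities(X, y, selected)
    while remaining and len(selected) < k:
        drops = {j: score - sum_of_infeasibilities(X, y, selected + [j])
                 for j in remaining}
        best = max(drops, key=drops.get)
        if drops[best] <= 0:          # no candidate reduces infeasibility
            break
        selected.append(best)
        remaining.remove(best)
        score -= drops[best]
    return selected

# Toy usage: the labels depend only on features 0 and 3.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))
y = np.where(X[:, 0] + 0.5 * X[:, 3] > 0.0, 1.0, -1.0)
print(greedy_forward_select(X, y, k=3))  # typically recovers [0, 3]
```

Each greedy step solves one small LP per candidate feature rather than performing a complete hyperplane placement per subset, which is the source of the claimed speed.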

► A new feature selection filtering criterion based on linear programming is introduced. ► It can be applied before or after hyperplane placement, or integrated with it. ► Applied after hyperplane placement, it preserves the separation induced on the training data. ► The methods are fast and effective. ► The integrated method gives the largest accuracy improvement and feature reduction.
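As a companion to the sketch above, the post-placement reduction described in the third highlight could reuse the same LP: take the separation induced by the trained hyperplane as the target labels, then drop features only while the LP stays fully feasible (zero infeasibility). The reduce_after_placement routine below is an assumption about how the transformation might be reused, not the paper's exact algorithm.

```python
def reduce_after_placement(X, w, b, tol=1e-9):
    """Backward elimination after a hyperplane (w, b) has been placed.

    The separation that (w, b) induces on the training data becomes the
    target; a feature is dropped only if the remaining features still
    achieve zero sum of infeasibilities, i.e. the same separation.
    Reuses the hypothetical sum_of_infeasibilities helper above.
    """
    y = np.where(X @ w + b >= 0.0, 1.0, -1.0)   # separation induced by (w, b)
    features = list(range(X.shape[1]))
    # Try to discard features in order of increasing |w_j| (least important first).
    for j in sorted(range(X.shape[1]), key=lambda j: abs(w[j])):
        trial = [f for f in features if f != j]
        if trial and sum_of_infeasibilities(X, y, trial) <= tol:
            features = trial                     # separation preserved without j
    return features
```

Because zero infeasibility implies a hyperplane that reproduces the target sign pattern exactly, the reduced feature set classifies every training point as the original hyperplane did, matching the preservation claim in the highlight.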

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors