Article ID | Journal | Published Year | Pages
---|---|---|---
1032452 | Omega | 2016 | 7
• Constrained subspace classifier (CSC) is proposed for high dimensional datasets.
• CSC appears to be a robust classifier compared to traditional two-step methods.
• An efficient alternating optimization technique is also proposed.
• CSC can serve as a one-step method for preprocessing-free classification.
Datasets with a significantly larger number of features than samples pose a serious challenge in supervised learning. Such datasets arise in various areas, including business analytics. In this paper, a new binary classification method called the constrained subspace classifier (CSC) is proposed for such high dimensional datasets. CSC improves on an earlier classification method, the local subspace classifier (LSC), by accounting for the relative angle between subspaces while approximating the classes with individual subspaces. CSC is formulated as an optimization problem and can be solved by an efficient alternating optimization technique. Classification performance is tested on publicly available datasets. The improvement in classification accuracy over LSC shows the importance of considering the relative angle between the subspaces while approximating the classes. Additionally, CSC appears to be a robust classifier compared to traditional two-step methods that perform feature selection and classification in two distinct steps.
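The abstract does not give the exact CSC objective, so the sketch below only illustrates the underlying subspace-classification idea it builds on: each class is approximated by a low-dimensional affine subspace fitted via SVD, a test point is assigned to the class whose subspace reconstructs it best, and the relative angle between two subspaces can be quantified through principal angles. All function names (`fit_class_subspace`, `reconstruction_error`, `principal_angles`) and the toy data are illustrative assumptions; the angle constraint and the alternating optimization of CSC itself are not reproduced here.

```python
import numpy as np

def fit_class_subspace(X, dim):
    """Fit an affine subspace (class mean + top `dim` principal directions) to samples X (n x d)."""
    mu = X.mean(axis=0)
    # Right singular vectors of the centered data span its principal subspace.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:dim].T  # basis: d x dim, orthonormal columns

def reconstruction_error(x, mu, basis):
    """Squared distance from x to the affine subspace defined by (mu, basis)."""
    r = x - mu
    return float(np.sum((r - basis @ (basis.T @ r)) ** 2))

def predict(x, subspaces):
    """Assign x to the class whose subspace approximates it best (smallest reconstruction error)."""
    errors = [reconstruction_error(x, mu, B) for mu, B in subspaces]
    return int(np.argmin(errors))

def principal_angles(B1, B2):
    """Principal angles between two subspaces with orthonormal bases B1, B2 (one common angle measure)."""
    cosines = np.clip(np.linalg.svd(B1.T @ B2, compute_uv=False), -1.0, 1.0)
    return np.arccos(cosines)

# Toy usage: two classes in 50 dimensions, 20 samples each, 3-dimensional class subspaces.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=0.0, size=(20, 50))
X1 = rng.normal(loc=2.0, size=(20, 50))
subspaces = [fit_class_subspace(X0, 3), fit_class_subspace(X1, 3)]
print(predict(X1[0], subspaces))                              # expected: 1
print(principal_angles(subspaces[0][1], subspaces[1][1]))     # angles between the two class subspaces
```

In this reading, LSC corresponds to fitting each class subspace independently, while CSC additionally couples the two fits through a term depending on their relative angle, which is what the alternating optimization in the paper is described as handling.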