Article ID | Journal | Published Year | Pages
---|---|---|---
387305 | Expert Systems with Applications | 2012 | 7 Pages
In this paper, we propose a filtering method for feature selection called ALOFT (At Least One FeaTure). The proposed method focuses on specific characteristics of the text categorization domain. It also ensures that every document in the training set is represented by at least one feature, and the number of selected features is determined in a data-driven way. We compare the effectiveness of the proposed method with the Variable Ranking method using three text categorization benchmarks (Reuters-21578, 20 Newsgroups and WebKB), two different classifiers (k-Nearest Neighbor and Naïve Bayes) and five feature evaluation functions. The experiments show that ALOFT obtains equivalent or better results than the classical Variable Ranking method.
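The coverage guarantee described above can be illustrated with a short sketch: for each training document, keep the feature with the highest evaluation score among the terms occurring in that document, then take the union over all documents. This is only an illustrative reading of the abstract, assuming precomputed per-term scores (e.g. from chi-square); the function and data names are hypothetical, not taken from the paper.

```python
def aloft_select(documents, scores):
    """ALOFT-style selection sketch (illustrative, not the paper's code):
    for each training document, keep the feature with the highest
    evaluation score among the terms it contains; the final feature
    set is the union over documents, so every non-empty document is
    covered by at least one selected feature and the final size is
    determined by the data rather than a user-chosen threshold."""
    selected = set()
    for terms in documents:
        if not terms:
            continue  # an empty document contributes nothing
        best = max(terms, key=lambda t: scores.get(t, 0.0))
        selected.add(best)
    return selected

# Toy corpus: three documents over a small vocabulary, with
# hypothetical feature-evaluation scores.
docs = [{"ball", "goal"}, {"goal", "match"}, {"stock", "market"}]
scores = {"ball": 0.2, "goal": 0.9, "match": 0.4,
          "stock": 0.7, "market": 0.3}
print(sorted(aloft_select(docs, scores)))  # → ['goal', 'stock']
```

Note that the final set size (two features here) emerges from the data: documents sharing a strong feature collapse onto it, while documents with distinct vocabulary each add one.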
► We propose a filtering method for text categorization called ALOFT.
► The proposed approach automatically finds the optimal number of features.
► ALOFT ensures that each document contributes to the final feature vector.
► ALOFT is fast and deterministic.
► Compared with the Variable Ranking algorithm, ALOFT obtains equivalent or better results.