Article ID | Journal ID | Publication Year | Article Language | Full-Text Version |
---|---|---|---|---|
387305 | 660900 | 2012 | English, 7-page PDF | Free download |
In this paper, we propose a filtering method for feature selection called ALOFT (At Least One FeaTure). The proposed method focuses on specific characteristics of the text categorization domain. It also ensures that every document in the training set is represented by at least one feature and that the number of selected features is determined in a data-driven way. We compare the effectiveness of the proposed method with the Variable Ranking method on three text categorization benchmarks (Reuters-21578, 20 Newsgroups and WebKB), using two different classifiers (k-Nearest Neighbor and Naïve Bayes) and five feature evaluation functions. The experiments show that ALOFT obtains results equivalent to or better than those of classical Variable Ranking.
► We propose a filtering method for text categorization called ALOFT.
► The proposed approach automatically finds the optimal number of features.
► ALOFT ensures that each document contributes to the final feature vector.
► ALOFT is fast and deterministic.
► When compared with the Variable Ranking (VR) algorithm, ALOFT obtains better results.
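The selection rule described above (each training document contributes its best-scoring feature, so the size of the final feature set emerges from the data rather than from a user-set threshold) can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation: the function name `aloft_select`, the set-of-term-ids document representation, and the toy scores are all assumptions.

```python
def aloft_select(docs, score):
    """Sketch of an ALOFT-style pass: for each document, keep its
    highest-scoring term, so every document is covered by at least
    one selected feature and the feature count is data-driven."""
    # Each doc is a set of term ids; score maps a term id to its
    # feature-evaluation value (e.g. chi-square, information gain).
    return {max(terms, key=score) for terms in docs if terms}

# Toy usage: three tiny "documents" over a four-term vocabulary.
docs = [{0, 1}, {1, 2}, {3}]
score = {0: 0.2, 1: 0.9, 2: 0.5, 3: 0.1}.get
print(sorted(aloft_select(docs, score)))  # → [1, 3]
```

Note that duplicates collapse naturally: term 1 is the best feature of two documents, so the final set has only two features for three documents, never more features than documents.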
Journal: Expert Systems with Applications - Volume 39, Issue 17, 1 December 2012, Pages 12851–12857