Article ID | Journal ID | Publication year | English article | Full-text version |
---|---|---|---|---|
531795 | 869876 | 2016 | 14-page PDF | Free download |
• We developed a joint feature selection and classification method with a constructive structure, making it practical for large data sets.
• The proposed method achieves state-of-the-art classification accuracy.
• The proposed method yields a sparser learned model in both the sample and feature domains.
• The run-time of the proposed method is much lower than that of state-of-the-art algorithms, especially for large data sets.
The recently proposed Relevance Sample-Feature Machine (RSFM) performs joint feature selection and classification with state-of-the-art accuracy and sparsity. However, it suffers from high computational cost on large training sets. To accelerate its training procedure, we introduce a new variant of this algorithm named the Incremental Relevance Sample-Feature Machine (IRSFM). In the IRSFM, the marginal likelihood maximization approach is modified so that model learning follows a constructive procedure: starting from an empty model, it iteratively adds or removes basis functions to construct the learned model. Extensive experiments on various data sets and comparisons with competing algorithms demonstrate the effectiveness of the proposed IRSFM in terms of accuracy, sparsity and run-time. While the IRSFM achieves almost the same classification accuracy as the RSFM, it yields a sparser learned model in both the sample and feature domains and requires much less training time than the RSFM, especially for large data sets.
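The constructive procedure described above can be illustrated with a greedy evidence-driven selection loop. The sketch below is not the authors' IRSFM update rules (which incrementally maximize the marginal likelihood with per-basis hyperparameters); it is a minimal, hypothetical analogue that starts from an empty model and adds or removes basis-function columns whenever doing so increases the log marginal likelihood of a linear-Gaussian model. The function names, the fixed `alpha` and `sigma2` hyperparameters, and the brute-force candidate scan are illustrative assumptions.

```python
import numpy as np

def log_evidence(Phi, y, alpha=1.0, sigma2=0.1):
    # Log marginal likelihood of y under a linear-Gaussian model with a
    # Gaussian prior w ~ N(0, alpha^-1 I) on the selected basis weights.
    # (alpha and sigma2 are fixed here for illustration only.)
    N = len(y)
    C = sigma2 * np.eye(N) + (Phi @ Phi.T) / alpha
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (N * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

def constructive_select(Phi_full, y, max_iter=50):
    # Constructive model learning: start from an empty model and, at each
    # step, apply the single addition or deletion of a basis-function
    # column that most increases the evidence; stop when no move helps.
    N, M = Phi_full.shape
    selected = []
    current = log_evidence(np.zeros((N, 0)), y)
    for _ in range(max_iter):
        best_gain, best_trial = 0.0, None
        for j in range(M):
            if j in selected:
                trial = [k for k in selected if k != j]   # candidate deletion
            else:
                trial = selected + [j]                    # candidate addition
            gain = log_evidence(Phi_full[:, trial], y) - current
            if gain > best_gain + 1e-8:
                best_gain, best_trial = gain, trial
        if best_trial is None:
            break                                         # converged
        selected, current = best_trial, current + best_gain
    return selected
```

On synthetic data where only a few columns of the design matrix carry signal, the loop typically recovers those columns and leaves the model sparse, which is the qualitative behavior the abstract attributes to the constructive procedure.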
Journal: Pattern Recognition - Volume 60, December 2016, Pages 835–848