Article code: 6865474
Journal code: 679032
Publication year: 2016
English article: 22-page PDF
Full-text version: free download
English title of the ISI article
Sparse preserving feature weights learning
Persian translation of the title
یادگیری وزن‌های ویژگی با حفظ تنکی
Keywords
Joint feature selection, Sparse representation, Feature weights learning
Related subjects
Engineering and Basic Sciences > Computer Engineering > Artificial Intelligence
English abstract
In this paper, we propose a novel unsupervised feature selection algorithm, named sparse preserving feature weights learning (SPFW), which builds on sparse representation, a recent theory of local data representation. SPFW differs from traditional feature selection algorithms in two aspects: (1) SPFW is designed on a locality measurement criterion with sparse reconstruction residual minimization; it determines the locality adaptively through sparse representation, instead of fixing the k-nearest neighbors in the original feature space. (2) SPFW selects the most discriminative feature subset from the whole feature set in batch mode, instead of selecting features individually. To optimize the proposed formulation, we propose an efficient iterative algorithm, where each iteration reduces to a subproblem that can be solved with off-the-shelf toolboxes. We conduct experiments on two face datasets to evaluate the performance of feature selection in terms of classification and clustering, and the results demonstrate the effectiveness of the proposed algorithm.
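Below is a minimal sketch of the sparsity-preserving selection idea the abstract describes, assuming a simplified two-step surrogate: an l1-penalized reconstruction of each sample from the remaining samples (the adaptive locality), followed by a closed-form batch scoring of all features. The helper names sparse_coefficients and feature_weights, and the scoring objective, are illustrative assumptions, not the paper's exact SPFW formulation or its iterative solver.

# Sketch of sparsity-preserving feature scoring (illustrative assumption,
# not the paper's exact SPFW objective or solver).
import numpy as np
from sklearn.linear_model import Lasso


def sparse_coefficients(X, alpha=0.05):
    """Sparse reconstruction of each row of X (n samples x d features)
    from all other rows: x_i ~= sum_j s_ij * x_j with an l1 penalty,
    so the locality is chosen adaptively rather than by fixed k-NN."""
    n = X.shape[0]
    S = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        model = Lasso(alpha=alpha, max_iter=5000)
        model.fit(X[others].T, X[i])   # design-matrix columns = the other samples
        S[i, others] = model.coef_     # sparse coefficients define the locality
    return S


def feature_weights(X, S, eps=1e-12):
    """Batch feature scores: minimizing sum_j w_j**2 * g_j over the simplex
    (w >= 0, sum w = 1), where g_j is feature j's reconstruction-residual
    energy, gives w_j proportional to 1 / g_j."""
    R = X - S @ X                      # residuals of the sparse reconstruction
    g = (R ** 2).sum(axis=0) + eps     # per-feature residual energy
    w = 1.0 / g
    return w / w.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((60, 20))  # toy data: 60 samples, 20 features
    S = sparse_coefficients(X)
    w = feature_weights(X, S)
    top = np.argsort(w)[::-1][:5]      # keep the 5 highest-weighted features
    print("selected feature indices:", top)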
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 185, 12 April 2016, Pages 45-52
Authors