Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
10326391 | 678070 | 2016 | 33-page PDF | Free download |
English title of the ISI article
Linear dimensionality reduction based on hybrid structure preserving projections
Translated title
Linear dimensionality reduction based on hybrid structure-preserving projections
Keywords
Dimensionality reduction, sparse representation, neighborhood representation, discriminative feature learning, hybrid data structure
Related subjects
Engineering and Basic Sciences
Computer Engineering
Artificial Intelligence
English abstract
Recent advances have shown that methods based on local structure preserving projections can effectively learn discriminative features. Two attractive approaches for characterizing such data structure are the classical nearest-neighbor strategy, which captures the neighborhood structure, and the sparse coding algorithm, which captures the sparsity structure. Motivated by an intuitive analysis of the relationship between the two structures, in this paper we take both of them into account and propose two integrated approaches for dimensionality reduction. Concretely, we first directly combine two existing objectives, one exploiting the neighborhood structure and one based on the sparsity structure, into a combined method, called CSNP for short. However, such a crude combination often degrades performance in practice. Instead of this superficial combination, we construct a hybrid structure by integrating the two structures and then propose Sparsity and Neighborhood Preserving Projections, dubbed SNPP, which preserves the hybrid structure in the reduced subspace. The resulting optimization problems can also be interpreted as instances of the general graph embedding framework and reduce to a generalized eigenvalue decomposition problem. Finally, we conduct extensive experiments on publicly available data sets to verify the efficacy of our algorithms. From the experimental results, we draw the tentative conclusion that neighborhood structure is more important for low-dimensional data, while sparsity structure is more useful for high-dimensional data.
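The abstract's claim that such structure-preserving objectives fall under the graph embedding framework and reduce to a generalized eigenvalue decomposition can be illustrated with a short sketch. The code below is not the authors' SNPP/CSNP implementation: the weight construction (a simple convex mix of a k-nearest-neighbor graph and Lasso-based sparse coding coefficients), the parameters `k`, `alpha`, `lam`, and the function names are assumptions made only to show how a hybrid affinity graph leads to the eigenproblem X^T L X p = λ X^T D X p.

```python
# Minimal sketch: linear projection that preserves a hybrid neighborhood/sparsity graph.
# Assumptions: samples are rows of X; parameters k, alpha, lam are illustrative only.
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph
from sklearn.linear_model import Lasso


def hybrid_weights(X, k=5, alpha=0.5, lam=0.1):
    """Blend a k-nearest-neighbor graph with sparse-representation weights."""
    n = X.shape[0]
    W_knn = kneighbors_graph(X, n_neighbors=k, mode='connectivity').toarray()
    W_sparse = np.zeros((n, n))
    for i in range(n):
        idx = np.arange(n) != i
        # Represent sample x_i as a sparse combination of all other samples.
        coder = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
        coder.fit(X[idx].T, X[i])
        W_sparse[i, idx] = np.abs(coder.coef_)
    W = alpha * W_knn + (1.0 - alpha) * W_sparse
    return (W + W.T) / 2.0  # symmetrize the combined affinity matrix


def structure_preserving_projection(X, d=2, **graph_kwargs):
    """Minimize sum_ij W_ij ||P^T x_i - P^T x_j||^2 via a generalized eigenproblem."""
    W = hybrid_weights(X, **graph_kwargs)
    D = np.diag(W.sum(axis=1))
    L = D - W  # graph Laplacian of the hybrid affinity graph
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])  # small ridge for numerical stability
    # Generalized eigenproblem A p = lambda B p; smallest eigenvectors span the subspace.
    vals, vecs = eigh(A, B)
    return vecs[:, :d]  # columns are the projection directions (n_features x d)
```

Under these assumptions, `P = structure_preserving_projection(X, d=2)` gives a projection matrix and `X @ P` the corresponding low-dimensional embedding; setting the mixing weight to favor the kNN term or the sparse-coding term mimics preferring the neighborhood or the sparsity structure, respectively.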
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 173, Part 3, 15 January 2016, Pages 518-529
Authors
Yupei Zhang, Ming Xiang, Bo Yang