Article code: 10127078
Journal code: 1645032
Publication year: 2018
English article, full text: 20-page PDF, free download
English title of the ISI article
Low-rank and sparse embedding for dimensionality reduction
Keywords
Dimensionality reduction; Subspace learning; Robustness; Overall optimum
Related subjects
Engineering and basic sciences > Computer engineering > Artificial intelligence
Abstract
In this paper, we propose a robust subspace learning (SL) framework for dimensionality reduction which further extends the existing SL methods to a low-rank and sparse embedding (LRSE) framework from three aspects: overall optimum, robustness and generalization. Owing to the use of low-rank and sparse constraints, both the global subspaces and local geometric structures of data are captured by the reconstruction coefficient matrix, and at the same time the low-dimensional embedding of data is enforced to respect the low-rankness and sparsity. In this way, the reconstruction coefficient matrix learning and SL are jointly performed, which can guarantee an overall optimum. Moreover, we adopt a sparse matrix to model the noise, which makes LRSE robust to different types of noise. The combination of global subspaces and local geometric structures brings better generalization for LRSE than related methods, i.e., LRSE performs better than conventional SL methods in both unsupervised and supervised scenarios; in the unsupervised scenario in particular, the improvement of classification accuracy is considerable. Seven specific SL methods, both unsupervised and supervised, can be derived from the proposed framework, and experiments on different data sets (including corrupted data) demonstrate the superiority of these methods over existing, well-established SL methods. Further, we use the experiments to provide some new insights for SL.
Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 108, December 2018, Pages 202-216