Article code: 10151195
Journal code: 1666107
Publication year: 2018
Full text (English): 43-page PDF, free download
English title of the ISI article
Symmetric low-rank preserving projections for subspace learning
Related subjects
Engineering and Physical Sciences · Computer Engineering · Artificial Intelligence
English abstract
Graph construction plays an important role in graph-oriented subspace learning. However, most existing approaches cannot simultaneously capture the global and local structures of high-dimensional data. To address this limitation, we propose a symmetric low-rank preserving projection (SLPP) framework that incorporates a symmetric constraint and a local regularization into low-rank representation learning for subspace learning. Under this framework, SLPP-M uses manifold regularization as its local regularizer, while SLPP-S uses sparsity regularization. Besides characterizing the global structure of high-dimensional data through a symmetric low-rank representation, SLPP-M and SLPP-S effectively exploit the local manifold and geometric structure via manifold and sparsity regularization, respectively. The similarity matrix is learned by solving a nuclear-norm minimization problem. Combined with graph embedding techniques, the resulting transformation matrix preserves the low-dimensional structural features of high-dimensional data. To exploit the available labels of training samples for classification, we also develop supervised versions of SLPP-M and SLPP-S under the SLPP framework, named S-SLPP-M and S-SLPP-S, respectively. Experimental results on face, handwriting, and object recognition tasks demonstrate the effectiveness of the proposed algorithms for subspace learning.
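The abstract outlines a two-stage pipeline: first learn a symmetric low-rank similarity matrix by nuclear-norm minimization, then obtain a projection matrix via graph embedding. The following NumPy sketch illustrates that idea under assumed simplifications (a plain proximal-gradient solver with singular value thresholding, and a standard Laplacian-embedding step); it omits the manifold and sparsity regularizers of the actual SLPP-M/SLPP-S objectives, and all function names and step sizes are illustrative, not from the paper.

```python
import numpy as np

def svt(M, tau):
    # Proximal operator of the nuclear norm: singular value thresholding.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def symmetric_low_rank_similarity(X, tau=1.0, n_iter=50, lr=0.01):
    # Proximal-gradient sketch for min_Z ||X - X Z||_F^2 + tau * ||Z||_*
    # with a symmetric constraint on Z (X is d features x n samples).
    n = X.shape[1]
    Z = np.zeros((n, n))
    for _ in range(n_iter):
        grad = -X.T @ (X - X @ Z)        # gradient of the reconstruction term
        Z = svt(Z - lr * grad, lr * tau)  # nuclear-norm proximal step
        Z = 0.5 * (Z + Z.T)               # enforce the symmetric constraint
    return np.abs(Z)                      # nonnegative similarity matrix

def embedding_projection(X, W, dim):
    # Graph-embedding step: minimize tr(P^T X L X^T P) over P,
    # where L = D - W is the Laplacian of the learned similarity W.
    L = np.diag(W.sum(axis=1)) - W
    vals, vecs = np.linalg.eigh(X @ L @ X.T)
    return vecs[:, :dim]  # eigenvectors of the smallest eigenvalues

# Toy usage: 20 features, 30 samples, project to 5 dimensions.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 30))
W = symmetric_low_rank_similarity(X)
P = embedding_projection(X, W, dim=5)
Y = P.T @ X  # low-dimensional representation, shape (5, 30)
```

The symmetrization step after each proximal update is a simple way to keep the representation symmetric; the supervised variants (S-SLPP-M/S-SLPP-S) would additionally restrict or reweight W using class labels.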
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 315, 13 November 2018, Pages 381-393
Authors