Article code: 533905
Journal code: 870190
Year of publication: 2014
English full-text version: 7 pages, PDF (free download)
English title of the ISI article
Convergent Projective Non-negative Matrix Factorization with Kullback–Leibler Divergence
Related subjects
Engineering and Basic Sciences > Computer Engineering > Computer Vision and Pattern Recognition
English abstract


• The convergence problem of Projective Non-negative Matrix Factorization is solved.
• A new iterative update formula for the basis matrix is rigorously derived.
• A proof of the algorithm's convergence is provided.
• The basis matrix has better orthogonality and sparseness.
• Higher recognition accuracy is achieved in face recognition.

To address the convergence problem of Projective Non-negative Matrix Factorization (P-NMF), a method called Convergent Projective Non-negative Matrix Factorization with Kullback–Leibler Divergence (CP-NMF-DIV) is proposed. CP-NMF-DIV minimizes a Kullback–Leibler divergence objective. Using a Taylor series expansion and the Newton iteration formula for root finding, an iterative update rule for the basis matrix is derived, and a proof of the algorithm's convergence is provided. Experimental results show that the algorithm converges faster; relative to Non-negative Matrix Factorization (NMF), the orthogonality and sparseness of the basis matrix are improved, although the data reconstruction results show that the basis matrix is still only approximately orthogonal; in face recognition, the method achieves higher recognition accuracy, which remains stable for most choices of the rank of the basis matrix. These results indicate that CP-NMF-DIV is effective.
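The paper's Newton-based update rule and convergence proof are not reproduced in this abstract. As an illustrative sketch only, the snippet below sets up the standard projective NMF model X ≈ W Wᵀ X with the generalized Kullback–Leibler divergence objective and decreases it by a projected gradient step with backtracking on the basis matrix W. The initialization, step-size rule, and iteration count are assumptions for illustration, not the update derived in the article.

import numpy as np

def kl_divergence(X, V, eps=1e-10):
    # Generalized KL divergence D(X || V) = sum(X * log(X / V) - X + V).
    return np.sum(X * np.log((X + eps) / (V + eps)) - X + V)

def pnmf_kl_sketch(X, rank, n_iter=100, eps=1e-10, seed=0):
    # Illustrative projective NMF: approximate X by W @ W.T @ X while
    # reducing D(X || W W^T X) via projected gradient descent on W.
    # (The paper instead derives a Newton-type iterative formula for W.)
    rng = np.random.default_rng(seed)
    m = X.shape[0]
    W = rng.uniform(0.1, 1.0, size=(m, rank))          # non-negative init
    loss = kl_divergence(X, W @ (W.T @ X), eps)
    for _ in range(n_iter):
        V = W @ (W.T @ X)                               # current reconstruction
        G = 1.0 - X / (V + eps)                         # dD/dV, elementwise
        grad = G @ (X.T @ W) + X @ (G.T @ W)            # chain rule for V = W W^T X
        step = 1.0
        for _ in range(50):                             # backtracking line search
            W_new = np.maximum(W - step * grad, eps)    # gradient step + non-negativity
            new_loss = kl_divergence(X, W_new @ (W_new.T @ X), eps)
            if new_loss < loss:
                break
            step *= 0.5
        W, loss = W_new, new_loss
    return W

# Usage on random non-negative data (assumed shapes, not the paper's experiments):
X = np.abs(np.random.default_rng(1).standard_normal((50, 100)))
W = pnmf_kl_sketch(X, rank=10)
print(kl_divergence(X, W @ (W.T @ X)))

The backtracking step is used here only to keep the sketch numerically stable; the article's contribution is a specific iterative formula for W whose convergence is proved analytically.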

Publisher
Database: Elsevier - ScienceDirect
Journal: Pattern Recognition Letters - Volume 36, 15 January 2014, Pages 15–21
Authors