• This paper considers a subspace learning problem in the presence of corruptions.
• The proposed method finds a robust solution using orthogonality and smoothness constraints.
• The proposed method can handle missing or unknown entries as well as outliers.
• The proposed method is extended to handle the rank uncertainty issue.
• We demonstrate that the proposed method is robust for various subspace learning problems.
Low-rank matrix factorization plays an important role in pattern recognition, computer vision, and machine learning. Recently, a new family of methods, such as l1-norm minimization and robust PCA, has been proposed for low-rank subspace analysis problems and has been shown to be robust against outliers and missing data. However, these methods suffer from heavy computational loads and can fail to find a solution when highly corrupted data are presented. In this paper, a robust orthogonal matrix approximation method using fixed-rank factorization is proposed. It finds a robust solution efficiently using orthogonality and smoothness constraints, and is further extended with a rank estimation strategy to handle the rank uncertainty that arises in practical real-world problems. The proposed method is applied to a number of low-rank matrix approximation problems, and experimental results show that it is highly accurate, fast, and efficient compared to existing methods.
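To make the setting concrete, the following is a minimal sketch (not the paper's algorithm) of fixed-rank matrix approximation with missing entries: the matrix is alternately imputed and projected onto the set of rank-r matrices via a truncated SVD, which also yields a factor U with orthonormal columns. The function name and iteration count are illustrative assumptions.

```python
import numpy as np

def fixed_rank_approx(X, mask, rank, n_iter=100):
    """Hard-imputation sketch of fixed-rank approximation.

    X    : (m, n) data matrix; entries where mask is False are ignored.
    mask : (m, n) boolean array of observed entries.
    rank : target rank r.

    Returns factors U (m, r) with orthonormal columns and V (r, n),
    so that U @ V is the rank-r estimate. This is a generic baseline,
    not the robust orthogonal method proposed in the paper.
    """
    Xhat = np.where(mask, X, 0.0)          # initialize missing entries to 0
    for _ in range(n_iter):
        # Project the current completed matrix onto rank-r matrices.
        U, s, Vt = np.linalg.svd(Xhat, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Keep observed entries; re-impute the missing ones from L.
        Xhat = np.where(mask, X, L)
    return U[:, :rank], s[:rank, None] * Vt[:rank]
```

On fully observed data of exact rank r, a single truncated SVD already recovers the matrix; with missing entries, the imputation loop gradually fills in the unobserved values. The paper's method additionally imposes smoothness constraints and outlier robustness, which this baseline omits.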
Journal: Neurocomputing - Volume 167, 1 November 2015, Pages 218–229