Article ID: 406286
Journal: Neurocomputing
Published Year: 2015
Pages: 12
File Type: PDF
Abstract

• This paper considers a subspace learning problem in the presence of corruptions.
• The proposed method finds a robust solution using orthogonality and smoothness constraints.
• The proposed method can handle missing or unknown entries as well as outliers.
• The proposed method is extended to handle the rank uncertainty issue.
• We demonstrate that the proposed method is robust for various subspace learning problems.

Low-rank matrix factorization plays an important role in pattern recognition, computer vision, and machine learning. Recently, a new family of methods, such as l1-norm minimization and robust PCA, has been proposed for low-rank subspace analysis problems and has been shown to be robust against outliers and missing data. However, these methods suffer from heavy computational loads and can fail to find a solution when highly corrupted data are presented. In this paper, a robust orthogonal matrix approximation method based on fixed-rank factorization is proposed. The proposed method finds a robust solution efficiently using orthogonality and smoothness constraints. It is also extended to handle rank uncertainty through a rank estimation strategy for practical real-world problems. The method is applied to a number of low-rank matrix approximation problems, and experimental results show that it is highly accurate, fast, and efficient compared to existing methods.
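To make the general setting concrete, the sketch below illustrates the broad family of techniques the abstract refers to: a fixed-rank factorization X ≈ U V^T with an orthonormal factor U, a robust (Huber-style) loss handled by iterative reweighting, and a binary mask for missing entries. This is a minimal illustration of the technique class, not the authors' algorithm; the function name, the IRLS weighting scheme, and the Procrustes-style U-update are illustrative assumptions.

import numpy as np

def robust_orthogonal_factorization(X, mask, rank, n_iters=50, delta=1.0):
    """Illustrative sketch (not the paper's method): fit X ~= U @ V.T with
    orthonormal U (U.T @ U = I), a Huber-style robust loss via IRLS
    reweighting, and a binary mask marking observed entries."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.standard_normal((m, rank)))  # random orthonormal start
    V = np.zeros((n, rank))
    W = mask.astype(float)  # IRLS weights; zero weight on missing entries

    for _ in range(n_iters):
        # V-step: weighted least squares, one column of X at a time.
        for j in range(n):
            Uw = U * W[:, j][:, None]
            G = Uw.T @ U + 1e-8 * np.eye(rank)
            V[j] = np.linalg.solve(G, Uw.T @ X[:, j])

        # U-step: Procrustes-style update, U = A @ B.T from the SVD of (W*X) @ V.
        # Exact only for uniform weights; used here as a simple approximation.
        A, _, Bt = np.linalg.svd((W * X) @ V, full_matrices=False)
        U = A @ Bt

        # Reweighting: Huber-style weights shrink the influence of large residuals.
        R = X - U @ V.T
        W = mask * np.minimum(1.0, delta / (np.abs(R) + 1e-12))
    return U, V

# Toy usage (assumed setup): rank-3 data with sparse outliers and 20% missing entries.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 80))
X[rng.random(X.shape) < 0.05] += 10.0   # sparse gross corruptions
mask = rng.random(X.shape) > 0.2        # observed-entry indicator
U, V = robust_orthogonal_factorization(X * mask, mask, rank=3)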

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence