| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 6939363 | Pattern Recognition | 2018 | 36 | |
Abstract
Conventional matrix completion methods are generally linear because they assume that the given data come from linear transformations of a lower-dimensional latent subspace and that the matrix is therefore of low rank. Consequently, these methods are not effective in recovering incomplete matrices when the data come from non-linear transformations of a lower-dimensional latent subspace; matrices consisting of such non-linear data are always of high rank or even full rank. In this paper, a novel method, called non-linear matrix completion (NLMC), is proposed to recover the missing entries of data matrices with non-linear structures. NLMC minimizes the rank (approximated by the Schatten p-norm) of the matrix in a feature space given by a non-linear mapping of the data (input) space, where the kernel trick is used to avoid carrying out the unknown non-linear mapping explicitly. The proposed NLMC is compared with existing methods on a toy matrix-completion example and on real problems including image inpainting and single-/multi-label classification. The experimental results verify the effectiveness and superiority of the proposed method. In addition, the idea of NLMC can be extended to a non-linear rank-minimization framework applicable to other problems such as non-linear denoising.
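The abstract does not spell out the optimization algorithm, but the kernel trick it mentions rests on a standard identity: the singular values of the mapped data matrix phi(X) are the square roots of the eigenvalues of the kernel matrix K = phi(X)^T phi(X), so the Schatten p-norm in feature space can be evaluated without ever forming phi explicitly. The sketch below illustrates only this identity; the function names, the RBF kernel choice, and the parameters `p` and `sigma` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)).

    Columns of X are treated as samples, matching the matrix-completion
    setting where each column is one data point. (Illustrative choice.)
    """
    sq = np.sum(X ** 2, axis=0)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X.T @ X)
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def schatten_p_norm_in_feature_space(X, p=0.5, sigma=1.0):
    """Schatten p-norm (raised to the p-th power) of phi(X) via the kernel trick.

    Since sigma_i(phi(X)) = sqrt(lambda_i(K)) with K = phi(X)^T phi(X),
    the unknown non-linear mapping phi never needs to be evaluated.
    """
    K = rbf_kernel(X, sigma)
    eigvals = np.clip(np.linalg.eigvalsh(K), 0.0, None)  # eigenvalues of K are >= 0
    singular_vals = np.sqrt(eigvals)                      # singular values of phi(X)
    return np.sum(singular_vals ** p)

# Toy usage: points on a one-dimensional non-linear manifold embedded in 3-D.
# The matrix has full linear rank, yet its feature-space spectrum decays quickly.
t = np.linspace(0.0, 1.0, 50)
X = np.vstack([t, t ** 2, np.sin(2 * np.pi * t)])
print(schatten_p_norm_in_feature_space(X, p=0.5))
```

In a completion setting, a quantity like this would serve as the non-linear low-rank surrogate to be minimized over the missing entries; the actual solver used by NLMC is described in the paper itself, not here.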
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Vision and Pattern Recognition
Authors
Jicong Fan, Tommy W.S. Chow
