Article ID: 412403
Journal: Neurocomputing
Published Year: 2013
Pages: 14
File Type: PDF
Abstract

For decades, subspace learning has received considerable interest in the pattern recognition and computer vision communities, and many promising methods have emerged to capture better subspaces from different perspectives. As a popular learning paradigm, matrix factorization is widely used to learn a new subspace from a high-dimensional data space. Some recent work treats the decomposed matrix from a statistical point of view, modeling the data points via ridge regression and minimizing the variance of the estimated parameters. However, these methods neglect the structural information embedded in the local neighborhood of each data point and fail to exploit prior knowledge. To address these problems, we present a novel subspace learning approach named Locally Constrained A-optimal nonnegative projection (LCA). The method preserves the local geometric structure of the learned subspace via neighborhood patches while projecting high-dimensional nonnegative data points onto a low-dimensional subspace. In addition, we incorporate supervised information as constraints to guide subspace learning, which further strengthens the discriminating power of the new subspace. The column vectors derived from the nonnegative projection therefore span a new subspace characterized by local consistency and better discriminative ability. Favorable experimental results verify the effectiveness of the proposed approach compared with several competitive methods.
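To make the ingredients concrete, the sketch below is a rough illustration only, not the authors' exact algorithm (the abstract does not give the objective function). It combines an NMF-style reconstruction term, a graph-Laplacian locality term built from k-nearest-neighbor patches, and an A-optimality term, i.e. the trace of the inverse ridge-regression covariance of the coefficients, minimized under nonnegativity by projected gradient descent. All function names, parameter values, and the choice of solver are assumptions.

# Illustrative sketch of a locally constrained, A-optimal nonnegative
# projection objective (assumed form, not the paper's exact algorithm):
#   min_{U,V >= 0}  ||X - U V||_F^2
#                 + alpha * Tr(V L V^T)              (locality via k-NN graph)
#                 + beta  * Tr((V V^T + lam I)^-1)   (A-optimality / variance)
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalized Laplacian of a k-nearest-neighbor graph (0/1 weights)."""
    n = X.shape[1]                        # X is d x n: columns are data points
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # skip self (distance 0)
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)                # symmetrize the neighborhood graph
    return np.diag(W.sum(axis=1)) - W

def lca_sketch(X, r=10, alpha=0.1, beta=0.1, lam=1.0, step=1e-3, iters=500):
    """Projected-gradient sketch: factor X (d x n) into U (d x r) @ V (r x n)."""
    rng = np.random.default_rng(0)
    d, n = X.shape
    U = rng.random((d, r))
    V = rng.random((r, n))
    L = knn_laplacian(X)
    I = np.eye(r)
    for _ in range(iters):
        R = U @ V - X                     # reconstruction residual
        gU = 2 * R @ V.T
        M = np.linalg.inv(V @ V.T + lam * I)
        # grad of Tr(M) w.r.t. V is -2 M^2 V; locality grad is 2 V L
        gV = 2 * U.T @ R + 2 * alpha * V @ L - 2 * beta * (M @ M) @ V
        U = np.maximum(U - step * gU, 0)  # gradient step, then project onto >= 0
        V = np.maximum(V - step * gV, 0)
    return U, V

X = np.abs(np.random.default_rng(1).random((20, 50)))  # toy nonnegative data
U, V = lca_sketch(X, r=5)
print(U.shape, V.shape)                   # (20, 5) (5, 50): V is the new representation

In the actual method, the supervised constraints mentioned above would enter as additional label-based terms on V; they are omitted here for brevity.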

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence