Article ID: 4969747
Journal: Pattern Recognition
Published Year: 2017
Pages: 12 Pages
File Type: PDF
Abstract
Dimensionality reduction methods based on linear embedding, such as neighborhood preserving embedding (NPE), sparsity preserving projections (SPP), and collaborative representation based projections (CRP), try to preserve a certain kind of linear representation for each sample after projection. However, the linear relationships among samples may change in the transformed low-dimensional space, which prevents linear representation-based classifiers, such as the sparse representation-based classifier (SRC), from achieving higher recognition accuracy. In this paper, we propose a new linear dimensionality reduction algorithm, called Regularized Coplanar Discriminant Analysis (RCDA), to address this problem. It simultaneously seeks a linear projection matrix and linear representation coefficients that make samples from the same class coplanar and samples from different classes non-coplanar. The proposed regularization term balances the bias from the optimal linear representation against that from the class mean, which avoids overfitting the training data and overcomes matrix singularity when solving for the linear representation coefficients. An alternating optimization approach is proposed to solve the RCDA model. Experiments on several benchmark face databases and hyperspectral image databases show that RCDA obtains better performance than other dimensionality reduction methods.
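The abstract does not state the objective explicitly. Purely as a hedged sketch of the kind of criterion it describes (a shared projection P, per-sample coefficients w_i over the same-class training matrix X_{c_i}, coefficients v_i over the other-class matrix X_{\tilde{c}_i}, and a regularizer pulling the within-class reconstruction toward the class mean \bar{x}_{c_i}; all symbols and the trade-off parameters \beta, \lambda are illustrative assumptions, not the paper's notation), one plausible form is:

\[
\min_{P,\,\{w_i\},\,\{v_i\}}\ \sum_i \Big( \big\| P^{\top} x_i - P^{\top} X_{c_i} w_i \big\|_2^2 \;-\; \beta\, \big\| P^{\top} x_i - P^{\top} X_{\tilde{c}_i} v_i \big\|_2^2 \;+\; \lambda\, \big\| X_{c_i} w_i - \bar{x}_{c_i} \big\|_2^2 \Big), \quad \text{s.t. } P^{\top} P = I .
\]

Under this reading, the first term encourages each projected sample to lie on the plane spanned by its own class, the second penalizes a good fit by other classes, and the third is the regularization term that biases the within-class representation toward the class mean (also keeping the coefficient subproblem well conditioned). Fixing P and solving for the coefficients, then fixing the coefficients and updating P, yields the alternating scheme mentioned above.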
Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition