Article ID Journal Published Year Pages File Type
493373 Procedia Technology 2012 5 Pages PDF
Abstract

The practical reality of high dimensionality imposes limitations in many pattern recognition areas such as text classification, data mining, information retrieval, and face recognition. Unsupervised PCA pays no attention to the class labels of the available training data. LDA is unstable due to the small sample size problem, and it yields the best projection directions only when each class has a Gaussian density with a common covariance matrix; it can fail when the class densities are more general and the class separability captured by the between-class scatter matrix is inadequate. The Maximum Margin Criterion (MMC) has lower computational cost and is more efficient than LDA for computing the discriminant vectors, since it avoids inverting the within-class scatter matrix. However, traditional MMC disregards the discriminative information in the local structure of the samples, and its performance depends on the choice of a coefficient. In this paper we characterize the locality of data points by computing distances among them using supervised knowledge. We compute the total scatter matrix in a Laplacian-graph-embedded space, and the resulting statistically uncorrelated discriminant vectors reduce redundancy among the extracted features; moreover, no constant needs to be chosen. Our experiments on the Reuters dataset show that the proposed algorithm is more efficient than LDA and MMC, and that it achieves similar or sometimes better results than other locality-based algorithms such as LPP and LSDA.
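The abstract contrasts MMC with LDA: MMC maximizes tr(W^T (S_b - S_w) W) and therefore needs no inverse of the within-class scatter matrix. The following is a minimal sketch of that baseline criterion only (not the paper's locality-based extension); the function name and the toy data are illustrative assumptions.

```python
# Sketch of the baseline Maximum Margin Criterion (MMC) mentioned above:
# discriminant vectors are the leading eigenvectors of S_b - S_w, so no
# inversion of the within-class scatter matrix is required (unlike LDA).
# `mmc_directions` and the synthetic data are illustrative, not from the paper.
import numpy as np

def mmc_directions(X, y, n_components):
    """Top `n_components` MMC discriminant vectors for data X, labels y."""
    mean_all = X.mean(axis=0)
    n_features = X.shape[1]
    S_b = np.zeros((n_features, n_features))  # between-class scatter
    S_w = np.zeros((n_features, n_features))  # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all).reshape(-1, 1)
        S_b += Xc.shape[0] * (diff @ diff.T)
        centered = Xc - mean_c
        S_w += centered.T @ centered
    # MMC maximizes tr(W^T (S_b - S_w) W): keep leading eigenvectors.
    eigvals, eigvecs = np.linalg.eigh(S_b - S_w)  # ascending eigenvalues
    return eigvecs[:, ::-1][:, :n_components]

# Toy usage on two well-separated synthetic classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 4)), rng.normal(3.0, 1.0, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
W = mmc_directions(X, y, 2)
print(W.shape)  # (4, 2)
```

Because `S_b - S_w` is symmetric, `eigh` returns orthonormal eigenvectors directly, which is the source of MMC's cost advantage over the generalized eigenproblem solved in LDA.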

Related Topics
Physical Sciences and Engineering Computer Science Computer Science (General)