Article ID: 407277
Journal: Neurocomputing
Published Year: 2016
Pages: 16
File Type: PDF
Abstract

High-dimensional data is hard to interpret and work with in its raw form, so dimensionality reduction is applied beforehand to discover the underlying low-dimensional manifold. Locality Preserving Projection (LPP) was introduced with the idea that data points that are neighbors in the high-dimensional space should remain neighbors in the low-dimensional space. In a typical pattern recognition problem, true neighbors are defined as the patterns belonging to the same class. Conventional LPP suffers from ambiguities in regions where data points from different classes lie close together, limited dimensionality reduction capacity, and data-dependent parameters. This article introduces several variants of LPP that address these problems. The extended version of LPP (ELPP) uses a weighting function that tunes the parameters according to the data and mitigates the other issues. Better class discrimination is obtained in a supervised variant (ESLPP-MD) through the use of intra-class and inter-class distances. To capture the non-linearity of the data, kernel-based variants that first map the data into a feature space are used. Data representation, clustering, face recognition, and facial expression recognition performance is reported on a large set of databases.
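
For context, the following is a minimal sketch of conventional LPP as described above (not the ELPP, ESLPP-MD, or kernel variants proposed in the article), assuming a k-nearest-neighbor adjacency graph with heat-kernel weights; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def lpp(X, n_components=2, n_neighbors=5, t=1.0):
    """Sketch of conventional Locality Preserving Projection.

    X: (n_samples, n_features) data matrix.
    Returns a projection matrix A of shape (n_features, n_components).
    """
    n = X.shape[0]

    # Pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T

    # k-nearest-neighbor adjacency with heat-kernel weights.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D2[i])[1:n_neighbors + 1]  # skip the point itself
        W[i, idx] = np.exp(-D2[i, idx] / t)
    W = np.maximum(W, W.T)  # symmetrize the graph

    # Graph Laplacian L = D - W.
    Ddiag = np.diag(W.sum(axis=1))
    L = Ddiag - W

    # Generalized eigenproblem X^T L X a = lambda X^T D X a;
    # the projection directions are the eigenvectors with the
    # smallest eigenvalues.
    A_mat = X.T @ L @ X
    B_mat = X.T @ Ddiag @ X + 1e-6 * np.eye(X.shape[1])  # regularized for stability
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B_mat, A_mat))
    order = np.argsort(eigvals.real)
    return eigvecs[:, order[:n_components]].real
```

The columns of the returned matrix are the projection directions, so new samples can be embedded by multiplying them with this matrix; the variants discussed in the abstract modify how the neighborhood weights and parameters are chosen.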

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors