Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
6940119 | Pattern Recognition Letters | 2018 | 6 | |
Abstract
In this paper, we study the connections between Rényi entropy PCA, kernel learning and graph embedding. A natural complementary formulation of maximum entropy PCA, namely minimum error entropy PCA, is presented. These two formulations can be combined to give a two-fold understanding of Rényi entropy PCA. Further, we establish connections between Rényi entropy PCA, kernel learning and graph embedding, and propose a generalized graph embedding framework that unifies a variety of existing algorithms. The proposed framework essentially subsumes the previous graph embedding framework, and partially answers the question of how to exploit high-order statistics of data in dimensionality reduction. The theoretical development establishes a close relationship between information-theoretic learning, kernel learning and graph embedding.
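To make the stated link between Rényi entropy PCA and kernel learning concrete, the sketch below illustrates one standard way this connection arises: the Parzen-window estimate of Rényi's quadratic entropy reduces to the sum of a Gaussian Gram matrix, so an entropy-motivated projection can be obtained from the leading eigenvectors of that (centered) Gram matrix, i.e. a kernel-PCA step. This is a minimal illustration under assumed names (`gaussian_gram`, `entropy_pca`), a chosen bandwidth `sigma`, and toy data; it is not the authors' exact formulation.

```python
# Minimal sketch of the Renyi-entropy / kernel-PCA connection (illustrative only).
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq = np.sum(X**2, axis=1)
    sq_dists = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-sq_dists / (2 * sigma**2))

def renyi_quadratic_entropy(K):
    """Parzen-window estimate H_2 = -log( (1/N^2) * sum_ij K[i, j] )."""
    n = K.shape[0]
    return -np.log(K.sum() / n**2)

def entropy_pca(X, n_components=2, sigma=1.0):
    """Project onto the leading eigenvectors of the centered Gram matrix
    (the kernel-PCA step that the entropy view motivates)."""
    K = gaussian_gram(X, sigma)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H
    eigvals, eigvecs = np.linalg.eigh(Kc)    # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas                       # embedded coordinates

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    K = gaussian_gram(X, sigma=1.0)
    print("Renyi quadratic entropy estimate:", renyi_quadratic_entropy(K))
    print("Embedding shape:", entropy_pca(X, n_components=2).shape)
```

In this view, retaining the top eigendirections of the Gram matrix preserves as much of the estimated information potential as possible, which is one reading of the "maximum entropy PCA" side of the two-fold formulation described in the abstract.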
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Vision and Pattern Recognition
Authors
Zhi-Yong Ran, Wei Wang, Bao-Gang Hu