Article ID: 407501
Journal: Neurocomputing
Published Year: 2015
Pages: 12 Pages
File Type: PDF
Abstract

Manifold regularized sparse coding shows promising performance in various applications. A key practical issue is how to adaptively select suitable graph hyper-parameters for the manifold-learning component of the sparse coding task. Cross validation is commonly used, but it scales poorly and easily leads to overfitting. In this article, multiple graph sparse coding (MGrSc) and multiple hypergraph sparse coding (MHGrSc) are proposed for image representation. Inspired by the Ensemble Manifold Regularizer, we formulate multiple graph and multiple hypergraph regularizers that enforce smoothness of the sparse codes along the geodesics of the data manifold, which is characterized by fusing a set of pre-defined graph Laplacians or hypergraph Laplacians. The proposed regularizers are then incorporated into the traditional sparse coding framework, yielding two unified sparse coding objective functions. Alternating optimization is used to minimize these objectives, and two novel manifold regularized sparse coding algorithms are presented. Both methods learn the composite manifold and the sparse codes jointly, so the graph hyper-parameters of the manifold regularizer are determined fully automatically. Image clustering experiments on real-world datasets demonstrate that the proposed sparse coding methods are superior to state-of-the-art methods.
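For concreteness, a plausible form of such a unified objective, assuming the standard graph-regularized sparse coding formulation and the Ensemble Manifold Regularizer's convex combination of candidate Laplacians (the exact notation and penalty terms in the paper may differ), is:

\min_{B,\, S,\, \boldsymbol{\mu}} \;\; \|X - BS\|_F^2 \;+\; \alpha \sum_{i} \|s_i\|_1 \;+\; \beta \, \mathrm{Tr}\!\Big( S \Big(\textstyle\sum_{k=1}^{K} \mu_k L_k\Big) S^\top \Big) \;+\; \gamma \, \|\boldsymbol{\mu}\|_2^2
\quad \text{s.t.} \;\; \sum_{k=1}^{K} \mu_k = 1, \;\; \mu_k \ge 0,

where X is the data matrix, B the dictionary, S the sparse codes, L_k the k-th candidate graph (or hypergraph) Laplacian, and \boldsymbol{\mu} the learned fusion weights. Alternating optimization would then cycle through updating S (a sparse coding step), B (a dictionary update), and \boldsymbol{\mu} (a simplex-constrained quadratic problem), so no graph hyper-parameter needs to be set by cross validation.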

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence