Article ID: 407715
Journal: Neurocomputing
Published Year: 2015
Pages: 9 Pages
File Type: PDF
Abstract

Traditional manifold learning algorithms, such as Locally Linear Embedding, Isomap and Laplacian Eigenmap, only provide the embedding results of the training samples. Although many extensions of these approaches try to solve the out-of-sample extension problem, their computations cannot avoid the eigen-decomposition of dense matrices, which is expensive in both time and memory. To address this problem, spectral regression (SR) casts the learning of an embedding function into a regression framework. Motivated by the effectiveness of the extreme learning machine (ELM), in this paper we solve the out-of-sample extension problem by seeking an embedding function in the ELM feature space. An extreme spectral regression (ESR) algorithm is proposed to further speed up kernel-based SR (KSR). In addition, it is proved that ESR is an approximation of KSR. Similar to SR, the proposed ESR algorithm can be performed in supervised, unsupervised and semi-supervised settings. Experimental results on classification and semi-supervised classification demonstrate the effectiveness and efficiency of our algorithm.
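To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the recipe the abstract describes: spectral targets are first computed for the training samples with a Laplacian Eigenmap-style embedding, and an out-of-sample embedding function is then obtained by ridge regression from a random ELM feature space onto those targets. The function names (elm_features, spectral_targets, fit_esr), the sigmoid hidden layer, the k-NN heat-kernel graph and the regularization value are illustrative assumptions, not details taken from the paper.

import numpy as np
from scipy.spatial.distance import cdist

def elm_features(X, W, b):
    """Random ELM hidden-layer features: sigmoid(X W + b)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def spectral_targets(X, n_components=2, n_neighbors=10):
    """Spectral embedding of the training samples from a k-NN graph Laplacian
    (assumption: any Laplacian Eigenmap-style embedding can serve as targets)."""
    D = cdist(X, X)
    idx = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]   # k nearest neighbors, self excluded
    A = np.zeros_like(D)
    rows = np.repeat(np.arange(X.shape[0]), n_neighbors)
    A[rows, idx.ravel()] = np.exp(-(D[rows, idx.ravel()] / D.mean()) ** 2)  # heat-kernel weights
    A = np.maximum(A, A.T)                               # symmetrize the adjacency
    L = np.diag(A.sum(axis=1)) - A                       # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)                       # eigenvalues in ascending order
    return vecs[:, 1:n_components + 1]                   # skip the trivial constant eigenvector

def fit_esr(X, Y, n_hidden=200, reg=1e-3, seed=0):
    """Ridge regression from the ELM feature space onto the spectral targets Y;
    returns a function that embeds new (out-of-sample) points."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))          # random, untrained input weights
    b = rng.normal(size=n_hidden)
    H = elm_features(X, W, b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return lambda X_new: elm_features(X_new, W, b) @ beta

# Usage: learn the embedding function on training data, then map unseen samples.
X_train = np.random.rand(100, 5)
Y = spectral_targets(X_train)
embed = fit_esr(X_train, Y)
print(embed(np.random.rand(10, 5)).shape)                # -> (10, 2)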

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors