Article ID: 408955 | Journal: Neurocomputing | Published Year: 2016 | Pages: 11 | File Type: PDF
Abstract

Most existing dimensionality reduction algorithms suffer from two disadvantages: their computational cost is high, and they cannot estimate the intrinsic dimension of the original dataset by themselves. To address these problems, in this paper we propose a fast linear dimensionality reduction method named Orthogonal Component Analysis (OCA). By avoiding both the eigenproblem and the matrix inverse problem, OCA achieves high-speed orthogonal component extraction. By introducing an adaptive threshold scheme, OCA is able to estimate the dimension of the feature space automatically. Meanwhile, the algorithm is guaranteed to be numerically stable. In the experiments, OCA is compared with several typical dimensionality reduction algorithms. The experimental results demonstrate that, as a universal algorithm, OCA is both efficient and effective.
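The abstract does not spell out OCA's update rules, but the general idea it describes, extracting orthogonal components one at a time without eigendecomposition or matrix inversion, and stopping via an adaptive threshold to estimate the intrinsic dimension, can be illustrated with a greedy deflation sketch. Everything below (the function name, the direction-selection rule, and the variance-ratio stopping criterion) is an assumption for illustration, not the paper's actual algorithm:

```python
import numpy as np

def orthogonal_components(X, energy_threshold=0.95, max_components=None):
    """Illustrative sketch of OCA-style extraction (assumed, not the
    paper's method): greedily pick a direction from the residual data,
    deflate, and stop once an adaptive variance threshold is reached,
    which doubles as an automatic estimate of the feature dimension."""
    X = X - X.mean(axis=0)               # center the data
    R = X.copy()                         # residual after deflation
    total_var = np.sum(R ** 2)
    captured = 0.0
    components = []
    d = X.shape[1] if max_components is None else max_components
    for _ in range(d):
        # next direction: the residual row with the largest norm
        # (no eigenproblem, no matrix inverse is solved)
        w = R[np.argmax(np.sum(R ** 2, axis=1))]
        norm = np.linalg.norm(w)
        if norm < 1e-12:                 # residual exhausted
            break
        w = w / norm
        proj = R @ w                     # project residual onto direction
        captured += np.sum(proj ** 2)
        components.append(w)
        R = R - np.outer(proj, w)        # deflate: remove this direction
        if captured / total_var >= energy_threshold:
            break                        # adaptive stop: dimension estimated
    return np.array(components)
```

Because each new direction is drawn from the deflated residual, which lies in the orthogonal complement of the components already extracted, the returned rows are orthonormal by construction; on data lying in a low-dimensional subspace, the threshold halts extraction at that subspace's dimension.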

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence