Article ID: 405818
Journal: Neurocomputing
Published Year: 2016
Pages: 9
File Type: PDF
Abstract

Sparse Gaussian process models provide an efficient way to perform regression on large data sets. Sparsification approaches select a representative subset of the available training data to induce the sparse model approximation. A variety of insertion and deletion criteria have been proposed, but they either lack accuracy or suffer from high computational costs. In this paper, we present a new and straightforward criterion for successive selection and deletion of training points in sparse Gaussian process regression. The proposed sparsification strategies are as fast as purely randomized schemes and are therefore well suited to online learning. Experiments on real-world robot data demonstrate that the resulting regression models are competitive with computationally intensive state-of-the-art methods in terms of generalization and accuracy. Furthermore, we employ our approach to learn inverse dynamics models for compliant robot control from very large data sets, i.e., half a million training points. This experiment also shows that the approximated sparse Gaussian process model is sufficiently fast for real-time prediction in robot control.
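To illustrate the kind of inducing-point approximation the abstract refers to, the following is a minimal sketch of a Subset-of-Regressors sparse GP regressor in Python with NumPy. It assumes an RBF kernel and uses random selection of inducing points as a stand-in; the paper's actual insertion/deletion criterion is not reproduced here, and all function names (rbf_kernel, sparse_gp_fit, sparse_gp_predict) are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row-wise point sets A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sparse_gp_fit(X, y, m=50, noise=0.1, seed=None):
    """Fit a Subset-of-Regressors sparse GP with m inducing points.

    Random selection is a placeholder for the paper's selection/deletion
    criterion, which is not reproduced here.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Xm = X[idx]
    Kmm = rbf_kernel(Xm, Xm)
    Kmn = rbf_kernel(Xm, X)
    # Solve (sigma^2 * Kmm + Kmn @ Knm) alpha = Kmn @ y:
    # costs O(n m^2) instead of the O(n^3) of a full GP.
    A = noise**2 * Kmm + Kmn @ Kmn.T + 1e-8 * np.eye(len(Xm))
    alpha = np.linalg.solve(A, Kmn @ y)
    return Xm, alpha

def sparse_gp_predict(Xstar, Xm, alpha):
    """Predictive mean at test points Xstar."""
    return rbf_kernel(Xstar, Xm) @ alpha

# Toy usage: noisy sine regression with 2000 points and 30 inducing points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
Xm, alpha = sparse_gp_fit(X, y, m=30, noise=0.1, seed=0)
print(sparse_gp_predict(np.array([[0.0], [1.5]]), Xm, alpha))
```

Because prediction only touches the m inducing points, the per-query cost is O(m), which is what makes such approximations viable for real-time control loops as described in the abstract.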

Related Topics
Physical Sciences and Engineering · Computer Science · Artificial Intelligence