Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
534293 | Pattern Recognition Letters | 2014 | 7 Pages |
- Two on-line approaches for Gaussian process (GP) regression are proposed.
- The first approach (RGP) is computationally cheap, with errors close to a full GP.
- The second approach (RGP★) additionally performs on-line hyperparameter learning.
- RGP★ can outperform even off-line learning algorithms in terms of error.
- RGP★ is computationally cheaper than other on-line learners with lower error.
Two approaches for on-line Gaussian process regression with low computational and memory demands are proposed. The first approach assumes known hyperparameters and performs regression on a set of basis vectors that stores mean and covariance estimates of the latent function. The second approach additionally learns the hyperparameters on-line. For this purpose, techniques from nonlinear Gaussian state estimation are exploited. The proposed approaches are compared to state-of-the-art sparse Gaussian process algorithms.
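The first approach can be illustrated with a minimal sketch: a fixed set of basis vectors stores a mean and covariance estimate of the latent function, and each streaming observation triggers a Kalman-style correction of those estimates. The kernel choice, hyperparameter values, basis placement, and the exact update form below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def rbf(A, B, ell=0.2, sf=1.0):
    # Squared-exponential kernel between 1-D input arrays A and B
    # (hyperparameter values are illustrative, not from the paper).
    d2 = (A[:, None] - B[None, :]) ** 2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

# Hypothetical basis-vector set on [0, 1]; the paper's selection strategy
# may differ.
Xb = np.linspace(0.0, 1.0, 10)            # basis vector locations
sigma_n = 0.1                              # assumed observation-noise std
Kbb = rbf(Xb, Xb) + 1e-9 * np.eye(len(Xb))
Kbb_inv = np.linalg.inv(Kbb)
mu = np.zeros(len(Xb))                     # mean estimate at basis vectors
C = Kbb.copy()                             # covariance estimate at basis vectors

def update(x, y):
    """One recursive (Kalman-style) update of (mu, C) from observation (x, y)."""
    global mu, C
    J = rbf(np.atleast_1d(x), Xb) @ Kbb_inv             # 1 x nb projection
    B = rbf(np.atleast_1d(x), np.atleast_1d(x)) - J @ Kbb @ J.T
    S = float(J @ C @ J.T + B + sigma_n**2)             # innovation variance
    G = (C @ J.T) / S                                   # gain, nb x 1
    mu = mu + (G * (y - float(J @ mu))).ravel()
    C = C - G @ (J @ C)

# Stream noisy samples of a toy latent function through the filter.
rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * np.pi * x)
var_before = np.trace(C)
for x in rng.uniform(0.0, 1.0, 200):
    update(x, f(x) + sigma_n * rng.normal())
var_after = np.trace(C)   # posterior uncertainty shrinks as data arrive
```

Because only the nb basis-vector estimates are updated, each step costs O(nb²) regardless of how many observations have been processed, which is the source of the low computational and memory demands claimed in the abstract.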