| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 4946450 | Knowledge-Based Systems | 2016 | 23 Pages | |
Abstract
Traditional Gaussian process (GP) regression often deteriorates when the data set is large-scale and/or non-stationary. To address these challenging data properties, we propose a K-Nearest-Neighbor-based Kalman filter for Gaussian process regression (KNN-KFGP). Firstly, we design a test-input-driven KNN mechanism to group the training set into a number of small collections. Secondly, we treat the latent function values of these collections as the unknown states and construct a novel state space model with a GP prior. Thirdly, we apply the Kalman filter to this state space model to efficiently estimate the latent function values for prediction. As a result, our KNN-KFGP framework can effectively alleviate the heavy computational load of GP regression through recursive Bayesian inference, especially when the data set is large-scale. Moreover, our KNN mechanism helps each test point find its strongly correlated local training subset, so KNN-KFGP can model non-stationarity in a flexible manner. Finally, we compare our KNN-KFGP to several related works and show its superior performance on a number of synthetic and real-world data sets.
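To make the first step of the abstract concrete, below is a minimal, illustrative sketch (not the authors' implementation) of the test-input-driven KNN idea: each test point selects its k nearest training points and a local GP posterior is computed on that subset. The full KNN-KFGP method additionally links these local subsets through a state space model with a GP prior and runs a Kalman filter over them, which is omitted here; all function and variable names (`rbf_kernel`, `knn_local_gp_predict`, `k`, `noise`) are hypothetical choices for illustration.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def knn_local_gp_predict(X_train, y_train, X_test, k=20, noise=1e-2):
    """For each test input, predict with a GP fit only on its k nearest
    training points (the test-input-driven KNN grouping step)."""
    means, variances = [], []
    for x_star in X_test:
        # Test-input-driven KNN: pick the k training points closest to x_star.
        idx = np.argsort(np.sum((X_train - x_star) ** 2, axis=1))[:k]
        Xk, yk = X_train[idx], y_train[idx]
        # Standard GP posterior mean and variance on the local subset.
        K = rbf_kernel(Xk, Xk) + noise * np.eye(len(idx))
        k_star = rbf_kernel(Xk, x_star[None, :])          # (k, 1)
        alpha = np.linalg.solve(K, yk)
        means.append((k_star.T @ alpha).item())
        v = np.linalg.solve(K, k_star)
        variances.append(
            (rbf_kernel(x_star[None, :], x_star[None, :]) - k_star.T @ v).item()
        )
    return np.array(means), np.array(variances)

# Example usage on toy 1-D data:
# X = np.linspace(0, 10, 500)[:, None]; y = np.sin(X[:, 0]) + 0.1 * np.random.randn(500)
# mu, var = knn_local_gp_predict(X, y, X[::50], k=30)
```

Because each prediction only inverts a k-by-k covariance matrix instead of the full n-by-n one, this local construction already conveys why the KNN grouping reduces the computational burden; the Kalman filter recursion in the paper then shares information across the grouped subsets.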
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Yali Wang, Brahim Chaib-draa