Article ID: 6870699
Journal: Computational Statistics & Data Analysis
Published Year: 2013
Pages: 16
File Type: PDF
Abstract
Kernel ridge regression is a technique to perform ridge regression with a potentially infinite number of nonlinear transformations of the independent variables as regressors. This method is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts. The influence of the choice of kernel and the setting of tuning parameters on forecast accuracy is investigated. Several popular kernels are reviewed, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. The latter two kernels are interpreted in terms of their smoothing properties, and the tuning parameters associated with all these kernels are related to smoothness measures of the prediction function and to the signal-to-noise ratio. Based on these interpretations, guidelines are provided for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study confirms the practical usefulness of these rules of thumb. Finally, the flexible and smooth functional forms provided by the Gaussian and Sinc kernels make them widely applicable. Therefore, their use is recommended instead of the popular polynomial kernels in general settings where no information on the data-generating process is available.
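As a rough illustration of the approach described in the abstract, the sketch below fits a kernel ridge regression with a Gaussian (RBF) kernel and picks the ridge penalty and kernel width from a small grid by cross-validation. It is a minimal example using scikit-learn's KernelRidge and GridSearchCV on simulated data; the grid values and the data-generating process are hypothetical and are not the grids or designs used in the paper.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

# Simulated nonlinear data (hypothetical DGP, for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)

# Small, illustrative grids for the two tuning parameters:
# alpha is the ridge penalty (tied to the signal-to-noise ratio),
# gamma is the Gaussian kernel width (governs smoothness of the fit).
param_grid = {
    "alpha": [1e-3, 1e-2, 1e-1, 1.0],
    "gamma": [0.1, 0.5, 1.0, 2.0],
}

# Select the tuning parameters by 5-fold cross-validation over the small grid.
krr = GridSearchCV(KernelRidge(kernel="rbf"), param_grid, cv=5)
krr.fit(X, y)
print(krr.best_params_)
```

In this sketch the penalty parameter plays the role tied to the signal-to-noise ratio, while the kernel width controls the smoothness of the prediction function, mirroring the interpretation of the tuning parameters given above.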
Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics
Authors