Article ID: 5097628
Journal: Journal of Econometrics
Published Year: 2006
Pages: 34
File Type: PDF
Abstract
It is standard in applied work to select forecasting models by ranking candidate models by their prediction mean squared error (PMSE) in simulated out-of-sample (SOOS) forecasts. Alternatively, forecast models may be selected using information criteria (IC). We compare the asymptotic and finite-sample properties of these methods in terms of their ability to minimize the true out-of-sample PMSE, allowing for possible misspecification of the forecast models under consideration. We show that under suitable conditions the IC method will be consistent for the best approximating model among the candidate models. In contrast, under standard assumptions the SOOS method, whether based on recursive or rolling regressions, will select overparameterized models with positive probability, resulting in excessive finite-sample PMSEs.
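To make the two selection rules concrete, below is a minimal Python sketch, not taken from the paper, contrasting recursive SOOS PMSE ranking with BIC selection among autoregressive models of increasing order. The AR(1) data-generating process, the candidate orders, and the evaluation split are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process; the best approximating model uses one lag.
T = 400
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + rng.standard_normal()

def design(y, p):
    """Constant plus lags 1..p as regressors, aligned with targets y[p:]."""
    n = len(y)
    lags = np.column_stack([y[p - 1 - j : n - 1 - j] for j in range(p)])
    return np.column_stack([np.ones(n - p), lags]), y[p:]

def soos_pmse(y, p, start):
    """Recursive SOOS: refit on data up to t, forecast y[t+1], average squared errors."""
    errs = []
    for t in range(start, len(y) - 1):
        X, target = design(y[: t + 1], p)
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        x_next = np.concatenate([[1.0], y[t : t - p : -1]])  # [1, y_t, ..., y_{t-p+1}]
        errs.append((y[t + 1] - x_next @ beta) ** 2)
    return np.mean(errs)

def bic(y, p):
    """Full-sample BIC: n*log(RSS/n) + k*log(n) with k = p + 1 parameters."""
    X, target = design(y, p)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    n = len(target)
    return n * np.log(resid @ resid / n) + (p + 1) * np.log(n)

orders = range(1, 5)
pmse = {p: soos_pmse(y, p, start=200) for p in orders}
bics = {p: bic(y, p) for p in orders}
print("SOOS-selected order:", min(pmse, key=pmse.get))
print("BIC-selected order: ", min(bics, key=bics.get))
```

Repeating this experiment over many simulated samples would show the pattern the abstract describes: the BIC rule settles on the one-lag model, while the SOOS ranking picks an overparameterized order with non-negligible probability.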