Article ID: 4947493
Journal: Neurocomputing
Published Year: 2017
Pages: 35
File Type: PDF
Abstract
Time series forecasting is of fundamental importance in big data analysis. The prediction of noisy, non-stationary, and chaotic time series demands good generalization from small amounts of data. Vapnik showed that the total risk depends on both the empirical error and the model complexity, where the latter may be measured in terms of the Vapnik-Chervonenkis (VC) dimension; good generalization therefore requires keeping model complexity low. The recently proposed Minimal Complexity Machine (MCM) has been shown to minimize a tight bound on the VC dimension, and has been extended to Minimal Complexity Machine Regression (MCMR). In this paper, we present an approach based on the MCM regressor that builds sparse and accurate models for short-term time series forecasting. Results on a number of datasets establish that the proposed approach is superior to several state-of-the-art methods and yields sparse models. These sparse models extract only the most important information present in the datasets, thereby achieving high accuracy. Sparsity in time series forecasting models is also important for reducing evaluation time, which matters when models must be evaluated in real time, such as when they are used as part of trading flows.
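The abstract describes the usual short-term forecasting setup: a window of lagged observations is used as the feature vector and a sparse regressor predicts the next value. MCMR is not available in standard libraries, so the Python sketch below uses scikit-learn's Lasso purely as a stand-in for the MCM regressor to illustrate the lagged-window formulation and the role of sparsity; the window length and penalty are illustrative assumptions, not values from the paper.

# Illustrative sketch only: Lasso stands in for the MCM regressor (MCMR),
# which is not part of standard libraries. Window size, penalty, and the
# synthetic series are assumptions for demonstration, not from the paper.
import numpy as np
from sklearn.linear_model import Lasso

def make_lagged(series, n_lags):
    # Each row of X holds n_lags past values; y holds the next value.
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

rng = np.random.default_rng(0)
t = np.arange(500)
series = np.sin(0.1 * t) + 0.1 * rng.standard_normal(len(t))  # noisy toy series

X, y = make_lagged(series, n_lags=10)
split = int(0.8 * len(X))            # chronological train/test split
model = Lasso(alpha=0.01)            # L1 penalty yields sparse coefficients
model.fit(X[:split], y[:split])

print("nonzero coefficients:", np.count_nonzero(model.coef_))
print("test MSE:", np.mean((model.predict(X[split:]) - y[split:]) ** 2))

A sparse coefficient vector here plays the role the abstract attributes to the MCMR models: only the most informative lags are retained, which also reduces the cost of evaluating the model at prediction time.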
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence