Article ID: 416650
Journal: Computational Statistics & Data Analysis
Published Year: 2006
Pages: 18
File Type: PDF
Abstract

A novel class of models for multivariate time series is presented. We consider hierarchical mixture-of-experts (HME) models in which the experts, or building blocks of the model, are vector autoregressions (VAR). The VAR-HME model is assumed to partition the covariate space, with time explicitly included as a covariate, into overlapping regions called overlays. Within each overlay a given number of VAR experts compete with each other, so that the expert most suitable for that overlay receives a large weight. The weights have a parametric form that allows the modeler to include relevant covariates. Estimation of the model parameters is achieved via the EM (expectation-maximization) algorithm. A new algorithm is also developed to select the optimal number of overlays, the number of VAR experts, and the model orders of the VARs that define a particular VAR-HME configuration; it uses the Bayesian information criterion (BIC) as the optimality criterion. Issues of model checking and inference of latent structure in multiple time series are investigated. The new methodology is illustrated by analyzing a synthetic data set and a 7-channel electroencephalogram data set.
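As a rough illustration of two ingredients mentioned in the abstract, the sketch below shows (a) covariate-dependent mixture weights computed with a standard softmax (multinomial-logit) gate, with scaled time as one covariate, and (b) the BIC used to compare candidate model configurations. The exact parametric form of the weights and the selection algorithm are specified in the paper; the function names softmax_gate and bic, and the softmax form itself, are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax_gate(X, Gamma):
    """Illustrative gating: weight of each expert as a softmax function
    of the covariates (here an intercept and scaled time).
    X     : (n, d) covariate matrix
    Gamma : (d, K) gating coefficients, one column per expert
    Returns an (n, K) weight matrix; each row sums to 1.
    """
    Z = X @ Gamma
    Z -= Z.max(axis=1, keepdims=True)   # subtract row max for numerical stability
    W = np.exp(Z)
    return W / W.sum(axis=1, keepdims=True)

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: -2 log L + k log n (smaller is better)."""
    return -2.0 * log_likelihood + n_params * np.log(n_obs)

# Toy usage: 3 hypothetical VAR experts gated on an intercept and scaled time.
n, K = 200, 3
t = np.arange(n) / n
X = np.column_stack([np.ones(n), t])
Gamma = np.random.default_rng(0).normal(size=(2, K))
weights = softmax_gate(X, Gamma)        # (n, K): expert weights at each time point
```

In a full EM fit, weights of this kind would serve as the prior expert probabilities in the E-step, while the BIC values of competing configurations (numbers of overlays, experts, and VAR orders) would drive the model-selection search described in the abstract.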
