Article ID: 416876
Journal: Computational Statistics & Data Analysis
Published Year: 2006
Pages: 27
File Type: PDF
Abstract

The evaluation of KIC, an alternative to the Akaike information criterion (AIC), is considered. The Kullback information criterion (KIC) is an approximately unbiased estimator of a risk function based on Kullback's symmetric divergence. However, when the sample size is small, or when the dimension of the candidate model is large relative to the sample size, this criterion displays a large negative bias. To overcome this problem, corrected versions of this criterion, KICc, are proposed for univariate autoregressive models and for multiple and multivariate regression models. The methodology of McQuarrie and Tsai for AIC and AICc is thus extended to the KIC criterion. The performance of the new criterion relative to other criteria is examined in a large simulation study.
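To make the criteria concrete, the following sketch computes AIC, its small-sample correction AICc, and KIC for a Gaussian linear regression. The penalties shown (2k for AIC, the standard AICc adjustment, and 3k for KIC, following Cavanaugh's formulation) are standard; the exact KICc correction proposed in the paper depends on the model class and is not reproduced here. Function and variable names are illustrative, not from the paper.

```python
import numpy as np

def gaussian_criteria(y, X):
    """Compute (AIC, AICc, KIC) for a Gaussian linear regression.

    Illustrative sketch: KIC = -2 log L + 3k uses the standard penalty
    for Kullback's symmetric divergence; the paper's KICc refines this
    penalty for small samples and is not reproduced here.
    """
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares fit
    resid = y - X @ beta
    sigma2 = resid @ resid / n                    # ML variance estimate
    k = p + 1                                     # coefficients + variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    aic = -2 * loglik + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)    # small-sample correction
    kic = -2 * loglik + 3 * k                     # heavier penalty than AIC
    return aic, aicc, kic

# Example on synthetic data: the heavier penalties order the criteria
# as AIC < AICc and AIC < KIC for any fixed fit.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=30)
aic, aicc, kic = gaussian_criteria(y, X)
```

Because KIC charges 3 per parameter rather than 2, it tends to select more parsimonious models than AIC, which is the behaviour the corrected KICc aims to preserve without the negative bias in small samples.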

Related Topics
Physical Sciences and Engineering Computer Science Computational Theory and Mathematics
Authors