| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 1147594 | Journal of Statistical Planning and Inference | 2012 | 11 Pages | |
Procedures such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), minimum description length (MDL), and the bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods rely on estimating a bias term. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown true model and an estimated model. Instead of bias estimation, this paper develops a bias-reduction procedure of the jackknife type. The jackknife method selects the model of minimum Kullback–Leibler divergence through bias reduction. It is shown that (a) the jackknife maximum likelihood estimator is consistent, (b) the jackknife estimate of the log likelihood is asymptotically unbiased, and (c) the stochastic order of the jackknife log likelihood estimate is O(log log n). Because of these properties, the jackknife information criterion is applicable to problems of choosing a model from separated families, especially when the true model is unknown. Compared with popular information criteria, which are applicable only to nested models such as those arising in regression and time series settings, the jackknife information criterion is more robust in filtering various types of candidate models when choosing the best approximating model.
► We define the jackknife information criterion and explore its statistical properties. ► Jackknife-type procedures seek bias reduction, not bias estimation as in AIC and BIC. ► The jackknife maximum likelihood estimator is consistent. ► The jackknife log likelihood estimate is asymptotically unbiased and of order O(log log n). ► JIC selects the best approximating model among candidates from different families.
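As an illustration of the idea, the sketch below computes one plausible jackknife-type selection score: the sum of leave-one-out log likelihoods, each evaluated at the maximum likelihood estimate obtained after deleting that observation, with the highest-scoring candidate treated as the model of minimum estimated Kullback–Leibler divergence. This is only an assumed form of the criterion, not necessarily the paper's exact definition of JIC, and the two candidate families (normal and exponential) are hypothetical examples of separated, non-nested families.

```python
# Illustrative sketch only: an assumed jackknife-type model-selection score,
# not the paper's exact JIC definition. The score is the sum over i of
# log f(x_i; theta_hat_{(-i)}), where theta_hat_{(-i)} is the MLE computed
# with the i-th observation deleted.
import numpy as np
from scipy import stats


def jackknife_loglik(x, fit, logpdf):
    """Sum of leave-one-out log likelihoods for one candidate family."""
    total = 0.0
    for i in range(len(x)):
        x_minus_i = np.delete(x, i)      # delete the i-th observation
        theta = fit(x_minus_i)           # MLE from the reduced sample
        total += logpdf(x[i], theta)     # predictive log-density at x_i
    return total


# Hypothetical candidate family 1: normal, MLEs (mean, sd).
fit_normal = lambda x: (x.mean(), x.std())
logpdf_normal = lambda xi, th: stats.norm.logpdf(xi, loc=th[0], scale=th[1])

# Hypothetical candidate family 2: exponential, MLE scale = sample mean.
fit_expon = lambda x: (x.mean(),)
logpdf_expon = lambda xi, th: stats.expon.logpdf(xi, scale=th[0])

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200)  # data drawn from the exponential family

scores = {
    "normal": jackknife_loglik(x, fit_normal, logpdf_normal),
    "exponential": jackknife_loglik(x, fit_expon, logpdf_expon),
}
# Larger score corresponds to smaller estimated Kullback-Leibler divergence.
best = max(scores, key=scores.get)
print(scores, "->", best)
```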