| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 10525867 | Statistical Methodology | 2005 | 18 | |
Abstract
In many situations, the predictive ability of a candidate model is its most important attribute. In light of our interest in this property, we introduce a new cross-validation model selection criterion, the predictive divergence criterion (PDC), together with a description of the target discrepancy upon which it is based. In the linear regression framework, we then develop an adjusted cross-validation model selection criterion (PDCa) which serves as the minimum variance unbiased estimator of this target discrepancy. Furthermore, we show that this adjusted criterion is asymptotically a minimum variance unbiased estimator of the Kullback-Leibler discrepancy, which serves as the basis for the Akaike information criteria AIC and AICc.
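The abstract contrasts cross-validation criteria with the Kullback-Leibler-based criteria AIC and AICc. As a hedged illustration of the quantities involved (not the paper's PDC or PDCa, which are not defined here), the sketch below computes the standard AICc and the closed-form leave-one-out cross-validation score for an ordinary least squares fit; the simulated data and the parameter count `k` (coefficients plus error variance) are illustrative assumptions.

```python
import numpy as np

# Simulated linear-regression data (illustrative only)
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])              # design: intercept + slope
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)

# Ordinary least squares fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
rss = resid @ resid

# AICc under a Gaussian likelihood (up to an additive constant);
# k counts the regression coefficients plus the error variance.
k = X.shape[1] + 1
aicc = n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

# Leave-one-out cross-validation via the hat matrix: for OLS the
# deleted residual is resid_i / (1 - h_ii), so no refitting is needed.
H = X @ np.linalg.solve(X.T @ X, X.T)
loo_resid = resid / (1 - np.diag(H))
loo_cv = np.mean(loo_resid ** 2)
```

Because each deleted residual divides the ordinary residual by `1 - h_ii` with `0 < h_ii < 1`, the leave-one-out score is never smaller than the in-sample mean squared error, which is the optimism that criteria like AICc correct for analytically.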
Related Topics
Physical Sciences and Engineering
Mathematics
Statistics and Probability
Authors
Simon L. Davies, Andrew A. Neath, Joseph E. Cavanaugh
