Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
1155185 | Statistics & Probability Letters | 2008 | 8 |
Abstract
This note provides a proof of a fundamental assumption in the verification of bootstrap AIC variants in mixed models. The assumption links the bootstrap data and the original sample data via the log-likelihood function, and is the key condition used in the validation of the criterion penalty terms. (See Assumption 3 of both Shibata [Shibata, R., 1997. Bootstrap estimate of Kullback-Leibler information for model selection. Statistica Sinica 7, 375-394] and Shang and Cavanaugh [Shang, J., Cavanaugh, J.E., 2008. Bootstrap variants of the Akaike information criterion for mixed model selection. Computational Statistics and Data Analysis 52, 2004-2021].) To state the assumption, let Y and Y* represent the response vector and the corresponding bootstrap sample, respectively. Let θ represent the set of parameters for a candidate mixed model, and let θ̂ denote the corresponding maximum likelihood estimator obtained by maximizing the likelihood L(θ | Y). With E* denoting the expectation with respect to the bootstrap distribution of Y*, the assumption asserts that E*[log L(θ̂ | Y*)] = log L(θ̂ | Y). We prove that the assumption holds under parametric, semiparametric, and nonparametric bootstrapping.
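To see why an identity of this form is plausible, the following is a minimal sketch for the nonparametric case only, under two simplifying assumptions not stated in the abstract: the log-likelihood decomposes into a sum of n independent case contributions ℓ(·; θ), and Y* is formed by resampling those cases with replacement from Y. This is an illustration, not the authors' proof, which treats the mixed-model structure and all three bootstrap schemes.

```latex
% Sketch: nonparametric bootstrap, assuming
%   log L(theta | Y) = sum_{j=1}^{n} ell(Y_j; theta)
% and Y_1^*, ..., Y_n^* drawn i.i.d. from the empirical distribution of Y.
\begin{align*}
E_*\!\left[\log L(\hat{\theta} \mid Y^*)\right]
  &= \sum_{i=1}^{n} E_*\!\left[\ell(Y_i^*; \hat{\theta})\right]
     && \text{(linearity of } E_* \text{)} \\
  &= \sum_{i=1}^{n} \frac{1}{n} \sum_{j=1}^{n} \ell(Y_j; \hat{\theta})
     && \text{(each } Y_i^* \text{ is uniform on the observed cases)} \\
  &= \sum_{j=1}^{n} \ell(Y_j; \hat{\theta})
   = \log L(\hat{\theta} \mid Y).
\end{align*}
```

The parametric and semiparametric bootstrap versions of the identity rest on properties of the fitted model rather than on the empirical distribution, and are established in the note itself.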
Related Topics
Physical Sciences and Engineering
Mathematics
Statistics and Probability
Authors
Junfeng Shang, Joseph E. Cavanaugh