Article ID: 6868695
Journal: Computational Statistics & Data Analysis
Published Year: 2018
Pages: 17
File Type: PDF
Abstract
The risk of the phi-divergence of a statistical model for categorical data is defined using two independent sets of data. The asymptotic bias of the phi-divergence based on the current data, used as an estimator of this risk, is shown to equal the negative of the penalty term of the Akaike information criterion (AIC). The higher-order asymptotic bias is also derived, but it depends on the form of the phi-divergence and on the method used to estimate the parameters, which may employ a possibly different form of the phi-divergence. An approximation to the higher-order bias is obtained from the simple result for the saturated model. Information criteria based on this approximation yield improved results in model-selection simulations, and some phi-divergences show advantages over the AIC.
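As background for the terminology in the abstract, the following is a minimal LaTeX sketch of the standard definitions involved, assuming the usual Csiszar phi-divergence for a multinomial model; the constant c in the penalty and the paper's specific higher-order correction are schematic placeholders, not the authors' exact expressions.

% Phi-divergence between the empirical cell proportions \hat{p} and the
% fitted model probabilities p(\hat\theta) over k categories (assumed
% standard Csiszar form, with \phi convex and \phi(1) = 0):
D_\phi\bigl(\hat{p}, p(\hat\theta)\bigr)
  = \sum_{i=1}^{k} p_i(\hat\theta)\,
    \phi\!\left(\frac{\hat{p}_i}{p_i(\hat\theta)}\right).

% The risk is the expected divergence evaluated on an independent copy of
% the data. Using the same data for fitting and evaluation underestimates
% this risk; the abstract states that the leading-order bias equals minus
% the AIC penalty, so a bias-corrected criterion takes the schematic form
\mathrm{IC}_\phi
  = D_\phi\bigl(\hat{p}, p(\hat\theta)\bigr) + \frac{c\,m}{n},
% where m is the number of free parameters, n the sample size, and c a
% scaling constant depending on the chosen normalization of D_\phi.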
Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics
Authors