Article ID: 565057
Journal: Signal Processing
Published Year: 2006
Pages: 6
File Type: PDF
Abstract

The Kullback information criterion, KIC, and its univariate bias-corrected version, KICc, may be viewed as estimators of the expected Kullback–Leibler symmetric divergence. This correspondence examines the overfitting properties of KIC and KICc through their probabilities of overfitting, both in finite samples and asymptotically. It is shown that KIC and KICc have much smaller probabilities of overfitting than the Akaike information criterion, AIC, and its bias-corrected version, AICc.
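The intuition behind the abstract's claim can be illustrated with a small model-selection sketch. The snippet below compares AIC, with its per-parameter penalty of 2, against a KIC-style criterion with a heavier per-parameter penalty of 3 (the exact penalty form of KIC varies slightly across references, so the `3 * k` term here is an illustrative assumption, not the paper's derivation). Because the KIC-style penalty is larger, it can never select a higher polynomial order than AIC on the same data, which is the mechanism behind its smaller overfitting probability.

```python
import numpy as np

def gaussian_loglik(resid, n):
    """Maximized Gaussian log-likelihood given residuals of a fitted model."""
    sigma2 = np.mean(resid ** 2)
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

def aic(loglik, k):
    return -2.0 * loglik + 2.0 * k

def kic(loglik, k):
    # Assumed KIC-style penalty: 3 per parameter instead of AIC's 2.
    return -2.0 * loglik + 3.0 * k

# Synthetic data from a true first-order (linear) model.
rng = np.random.default_rng(0)
n = 30
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.3, n)

def select(criterion, max_order=5):
    """Return the polynomial order minimizing the given criterion."""
    scores = []
    for order in range(max_order + 1):
        coef = np.polyfit(x, y, order)
        resid = y - np.polyval(coef, x)
        k = order + 2  # polynomial coefficients plus the noise variance
        scores.append(criterion(gaussian_loglik(resid, n), k))
    return int(np.argmin(scores))

print("AIC selects order:", select(aic))
print("KIC-style criterion selects order:", select(kic))
```

With nested models and a penalty that is linear in the parameter count, the selected order is non-increasing in the penalty constant, so the heavier-penalty criterion always picks an order no larger than AIC's; this is a toy illustration of the overfitting-probability comparison, not the finite-sample analysis the correspondence carries out.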

Related Topics: Physical Sciences and Engineering › Computer Science › Signal Processing