Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
416787 | 681399 | 2013 | 17-page PDF | Free download |

Semi-supervised classification can improve generative classifiers by taking into account the information provided by unlabeled data points, especially when unlabeled data far outnumber labeled data. The aim is to select a generative classification model using both the unlabeled and the labeled data. A predictive deviance criterion, AICcond, is proposed to select a parsimonious and relevant generative classifier in the semi-supervised context. In contrast to standard information criteria such as AIC and BIC, AICcond is focused on the classification task: it attempts to measure the predictive power of a generative model by approximating its predictive deviance. At the same time, it avoids the computational cost of cross-validation criteria, which make repeated use of the EM algorithm. AICcond is proved to have consistency properties that ensure its parsimony compared with the Bayesian Entropy Criterion (BEC), whose focus is similar to that of AICcond. Numerical experiments on both simulated and real data sets show that the behavior of AICcond for variable and model selection is encouraging compared to the competing criteria.
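To make the contrast with AIC and BIC concrete, the sketch below recalls the standard penalized-deviance form of those criteria and the conditional (classification) deviance that a predictive criterion such as AICcond aims to approximate. The notation here (a labeled set $\mathcal{L}$, parameter count $K$, sample size $n$) is introduced for illustration, and the last expression is an illustrative conditional deviance, not the paper's exact definition of AICcond.

```latex
% AIC and BIC penalize the maximized log-likelihood of all data;
% a classification-focused criterion instead targets the conditional
% (predictive) deviance of the labels given the features.
\begin{align*}
  \mathrm{AIC} &= -2\,\log L(\hat{\theta}) + 2K, \\
  \mathrm{BIC} &= -2\,\log L(\hat{\theta}) + K \log n, \\
  D_{\mathrm{cond}}(\hat{\theta})
    &= -2 \sum_{i \in \mathcal{L}} \log p\bigl(z_i \mid x_i ;\, \hat{\theta}\bigr),
\end{align*}
% where \hat{\theta} is the maximum-likelihood estimate obtained (e.g. via EM)
% from the labeled and unlabeled data together, K is the number of free
% parameters, n the sample size, and z_i the class label of x_i.
```

In this reading, AIC and BIC reward fit to the joint distribution of the features, whereas a predictive criterion scores how well the fitted generative model recovers the labels, which is the quantity that matters for the classification task the abstract describes.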
Journal: Computational Statistics & Data Analysis - Volume 64, August 2013, Pages 220–236