Article code: 6903530
Journal code: 1446991
Publication year: 2018
English article: 12-page PDF
Full-text version: Free download
English title of the ISI article
Mixture of latent multinomial naive Bayes classifier
Related subjects
Engineering and Basic Sciences; Computer Engineering; Computer Science Software
English abstract
The naive Bayes classifier has been extensively applied in various domains over the past few decades due to its simple structure and remarkable predictive performance. However, it rests on a strong assumption that limits its use in many real-world applications: conditional independence of the attributes given the class. In this paper, we propose the mixture of latent multinomial naive Bayes (MLMNB) classifier as an extension of naive Bayes that relaxes the independence assumption. MLMNB incorporates a latent variable in a predefined Bayesian network structure to model the dependencies among attributes, yet avoids the computational burden of structure-learning approaches. We theoretically prove that MLMNB automatically reduces to the naive Bayes classifier whenever the conditional independence assumption holds. The expectation-maximization (EM) algorithm is modified for parameter estimation. Experimental results on 36 datasets from the University of California, Irvine (UCI) machine learning repository show that MLMNB achieves substantial predictive performance relative to state-of-the-art modifications of the naive Bayes classifier, in terms of classification accuracy (ACC), conditional log-likelihood (CLL), and area under the ROC curve (AUC).
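To illustrate the general idea described in the abstract (a latent variable per class whose components relax attribute independence, fitted with EM), the following is a minimal sketch of a generic mixture-of-multinomial-naive-Bayes classifier. It is not the paper's exact MLMNB formulation; the network structure, smoothing, and update equations shown here are simplifying assumptions, and the function names are hypothetical.

```python
# Hedged sketch: per-class mixture of multinomials trained with EM.
# Assumption: X is a non-negative count matrix (n_samples, n_features), y holds integer class labels.
import numpy as np

def fit_latent_mnb(X, y, n_latent=2, n_iter=50, alpha=1.0, seed=0):
    rng = np.random.default_rng(seed)
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]                                    # samples of class c
        n, d = Xc.shape
        resp = rng.dirichlet(np.ones(n_latent), size=n)   # random responsibilities P(z | x, c)
        for _ in range(n_iter):
            # M-step: mixing weights P(z | c) and multinomial parameters P(attribute | z, c)
            pi = np.clip(resp.sum(axis=0) / n, 1e-12, None)
            counts = resp.T @ Xc                           # expected feature counts per component
            theta = (counts + alpha) / (counts.sum(axis=1, keepdims=True) + alpha * d)
            # E-step: posterior over the latent component z (log-domain for stability)
            log_p = np.log(pi) + Xc @ np.log(theta).T      # (n, n_latent)
            log_p -= log_p.max(axis=1, keepdims=True)
            resp = np.exp(log_p)
            resp /= resp.sum(axis=1, keepdims=True)
        params[c] = (np.log(n / len(X)), np.log(pi), np.log(theta))
    return params

def predict(params, X):
    scores = []
    for _, (log_prior, log_pi, log_theta) in params.items():
        comp = log_pi + X @ log_theta.T                    # joint log-score per latent component
        scores.append(log_prior + np.logaddexp.reduce(comp, axis=1))  # marginalize over z
    classes = np.array(list(params.keys()))
    return classes[np.argmax(np.vstack(scores), axis=0)]
```

With a single latent component per class (n_latent=1), the mixture collapses to an ordinary multinomial naive Bayes model, mirroring the abstract's claim that the model shrinks to naive Bayes when conditional independence holds.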
Publisher
Database: Elsevier - ScienceDirect
Journal: Applied Soft Computing - Volume 69, August 2018, Pages 516-527
Authors