Article code: 405922 | Journal code: 678048 | Publication year: 2016 | 17 pages, PDF full text
English title
Robust mixture of experts modeling using the t distribution
Related subjects
Engineering and Basic Sciences · Computer Engineering · Artificial Intelligence
Abstract

Mixture of Experts (MoE) is a popular framework for modeling heterogeneity in data for regression, classification, and clustering. For regression and cluster analyses of continuous data, MoE usually uses normal experts following the Gaussian distribution. However, for a data set containing a group or groups of observations with heavy tails or atypical observations, normal experts are unsuitable and can unduly affect the fit of the MoE model. We introduce robust MoE modeling using the t distribution. The proposed t MoE (TMoE) deals with these issues of heavy-tailed and noisy data. We develop a dedicated expectation-maximization (EM) algorithm that estimates the parameters of the proposed model by monotonically maximizing the observed-data log-likelihood. We describe how the presented model can be used in prediction and in model-based clustering of regression data. The proposed model is validated in numerical experiments on simulated data, which show its effectiveness and robustness both in modeling non-linear regression functions and in model-based clustering. It is then applied to real-world tone-perception data for musical data analysis, and to temperature-anomaly data for the analysis of climate change. The obtained results show the usefulness of the TMoE model for practical applications.
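The robustness mechanism the abstract describes can be illustrated on the simplest building block: a single regression expert with Student-t errors fitted by EM. This is a hedged minimal sketch, not the paper's full TMoE (it omits the softmax gating network and the mixture over experts, and fixes the degrees of freedom ν); the function name and the two-component test data are hypothetical. The key point it shows is how the E-step weight (ν + 1)/(ν + r²/σ²) downweights large residuals, so outliers cannot dominate the weighted least-squares M-step the way they dominate a Gaussian fit.

```python
import numpy as np

def fit_t_regression(X, y, nu=3.0, n_iter=50):
    """EM for linear regression with Student-t errors (fixed dof nu).

    Hypothetical illustration of the t-expert robustness mechanism:
    latent scales u_i ~ Gamma(nu/2, nu/2) give y_i | u_i ~ N(x_i'beta,
    sigma^2 / u_i); the E-step posterior mean of u_i downweights
    observations with large residuals.
    """
    n, _ = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # Gaussian (OLS) start
    sigma2 = np.var(y - X @ beta)
    for _ in range(n_iter):
        r2 = (y - X @ beta) ** 2
        w = (nu + 1.0) / (nu + r2 / sigma2)       # E-step: E[u_i | y_i]
        # M-step: weighted least squares for beta, weighted residual
        # variance for sigma^2 (monotonically increases the likelihood)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        sigma2 = np.sum(w * (y - X @ beta) ** 2) / n
    return beta, sigma2

# Usage sketch: line y = 1 + 2x with a few gross outliers added.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.uniform(-1.0, 1.0, 200)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(200)
y[:10] += 8.0                                     # atypical observations
beta_t, sigma2 = fit_t_regression(X, y, nu=3.0)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]   # Gaussian fit, pulled off
```

In the TMoE of the paper, each expert uses this t-error structure and a gating network mixes the experts, with all parameters updated jointly inside one EM loop.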

Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 79, July 2016, Pages 20–36
Authors