Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
720226 | 1461229 | 2014 | 6-page PDF | Free download |
In this article, an improved variational inference (VI) framework for learning finite Beta-Liouville mixture (BLM) models is proposed for the classification and clustering of proportional data. Within the VI framework, non-linear approximation techniques are adopted to obtain approximate variational objective functions, and analytical solutions are derived for the variational posterior distributions. Compared with the expectation-maximization (EM) algorithm commonly used for learning mixture models, underfitting and overfitting are avoided. Furthermore, the parameters and the complexity of the mixture model (model order) can be estimated simultaneously. Experiments on both synthetic and real-world data sets demonstrate the feasibility and advantages of the proposed method.
Journal: The Journal of China Universities of Posts and Telecommunications - Volume 21, Issue 2, April 2014, Pages 98-103
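The abstract's central idea, learning a mixture by variational inference so that surplus components are pruned and the model order is estimated alongside the parameters, can be illustrated with a small sketch. The paper's Beta-Liouville mixture is not available in standard libraries, so the sketch below substitutes scikit-learn's `BayesianGaussianMixture` as a stand-in for the same VI-with-pruning behaviour; the synthetic proportional data, thresholds, and parameter settings are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: VI learning of a mixture with automatic model-order selection.
# BayesianGaussianMixture stands in for the paper's Beta-Liouville mixture;
# all data and settings below are assumptions for illustration only.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Synthetic proportional data: two clusters of 3-dimensional vectors summing to 1.
cluster_a = rng.dirichlet([8, 2, 1], size=150)
cluster_b = rng.dirichlet([1, 3, 9], size=150)
X = np.vstack([cluster_a, cluster_b])

# Start with more components than needed; the variational treatment with a
# Dirichlet-process weight prior drives the weights of surplus components
# toward zero, so the effective model order is inferred with the parameters.
vb_mixture = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(X)

effective_k = np.sum(vb_mixture.weights_ > 1e-2)
print("effective number of components:", effective_k)  # expected: 2
print("cluster assignments for first samples:", vb_mixture.predict(X[:5]))
```

The point of the sketch is the qualitative behaviour the abstract claims for the proposed method: starting from a deliberately over-sized mixture, the variational posterior concentrates mass on only as many components as the data support, avoiding the over/underfitting that a fixed-order EM fit can exhibit.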