| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 413002 | Neurocomputing | 2009 | 4 | |
Abstract
We prove that the evaluation function of variational Bayesian (VB) clustering algorithms can be described as the log likelihood of the given data minus the Kullback–Leibler (KL) divergence between the prior and the posterior of the model parameters. In this novel formalism of VB, the evaluation functions can be explicitly interpreted as information criteria for model selection, and the KL divergence imposes a heavy penalty on posteriors far from the prior. We derive the update process of variational Bayesian clustering with a finite mixture of Student's t-distributions, taking into account the penalty term for the degrees of freedom.
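The decomposition stated in the abstract can be written, in generic VB notation (the symbols below are illustrative and not taken from the paper itself), as the standard variational lower bound on the marginal likelihood:

```latex
% Variational lower bound (free energy) for data X, parameters \theta,
% prior p(\theta), and variational posterior q(\theta):
\mathcal{F}[q]
  \;=\; \mathbb{E}_{q(\theta)}\!\left[\log p(X \mid \theta)\right]
  \;-\; \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta)\right)
  \;\le\; \log p(X).
```

The first term is the expected log likelihood of the data, and the second is the KL term that penalizes posteriors far from the prior, which is the penalty interpretation described in the abstract.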
Authors
Takashi Takekawa, Tomoki Fukai