Article ID: 406043
Journal: Neurocomputing
Published Year: 2015
Pages: 12
File Type: PDF
Abstract

This paper proposes a novel automatic model selection algorithm for learning Gaussian mixtures. Unlike standard EM, we additionally increase the negative entropy of the posterior over the latent variables, which exerts an indirect effect on model selection. This increase in negative entropy can be interpreted as a competition among components, corresponding to the annihilation of those components with insufficient data to support them. More importantly, this competition depends only on the data itself. Additionally, we seamlessly integrate parameter estimation and model selection into a single algorithm, which can be applied to any parametric mixture model solvable by an EM algorithm. Experiments on Gaussian mixtures demonstrate the effectiveness of our approach to model selection.
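The abstract describes the mechanism only at a high level, so the sketch below illustrates one plausible reading of it: EM for a Gaussian mixture in which the E-step posterior is sharpened (raising its negative entropy) so that components compete for responsibility, and components whose mixing weights collapse are annihilated during estimation. The exponent `beta`, the pruning threshold `prune_tol`, and the name `entropy_penalized_em` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def entropy_penalized_em(X, K=10, beta=1.1, prune_tol=1e-3, n_iter=200, seed=0):
    """Sketch of EM for a Gaussian mixture with an entropy-style
    sharpening of the E-step posterior. Components whose mixing
    weight falls below prune_tol are annihilated, so model selection
    happens jointly with parameter estimation.

    NOTE: the beta-exponent sharpening is a stand-in for the paper's
    negative-entropy term, chosen only to make the competition concrete.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, K, replace=False)]               # init means from data points
    cov = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * K)  # shared initial covariance
    pi = np.full(K, 1.0 / K)

    for _ in range(n_iter):
        # E-step: posterior responsibilities of each component for each point
        dens = np.column_stack([
            pi[k] * multivariate_normal.pdf(X, mu[k], cov[k])
            for k in range(len(pi))
        ])
        r = dens / dens.sum(axis=1, keepdims=True)
        # Sharpen the posterior: raising its negative entropy induces a
        # winner-take-more competition among components
        r = r ** beta
        r /= r.sum(axis=1, keepdims=True)

        # M-step: standard weighted updates
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        for k in range(len(pi)):
            diff = X - mu[k]
            cov[k] = (r[:, k, None] * diff).T @ diff / nk[k] + 1e-6 * np.eye(d)

        # Annihilate components with insufficient data support
        keep = pi > prune_tol
        if not keep.all():
            pi, mu, cov = pi[keep] / pi[keep].sum(), mu[keep], cov[keep]

    return pi, mu, cov
```

Starting from a deliberately over-specified K, the surviving number of components after annihilation serves as the selected model order, which matches the abstract's claim that estimation and selection are carried out in a single pass.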

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence