Article ID: 4633146
Journal: Applied Mathematics and Computation
Published Year: 2008
Pages: 9
File Type: PDF
Abstract

The Gaussian mixture model has been used extensively in the fields of information processing and data analysis. However, its model selection, i.e., the selection of the number of components (Gaussians) in the mixture, remains a difficult problem. Fortunately, the newly established Bayesian Ying–Yang (BYY) harmony function provides an efficient criterion for the model selection of a Gaussian mixture from a set of sample data. In this paper, we propose a BYY scale-incremental EM algorithm for Gaussian mixture learning via a component split rule that increases the BYY harmony function incrementally. In particular, starting from two components and adding one component via the split rule after each EM procedure until a maximum number of components is reached, the algorithm increases the scale of the mixture incrementally and leads to the maximization of the BYY harmony function, together with correct model selection and good parameter estimation for the Gaussian mixture. Simulation experiments demonstrate that this BYY scale-incremental EM algorithm performs both model selection and parameter estimation efficiently for Gaussian mixture modeling. Moreover, the BYY scale-incremental EM algorithm is successfully applied to two real-life data sets, namely Iris data classification and unsupervised color image segmentation.

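As a rough illustration of the incremental scheme described in the abstract, the Python sketch below grows a Gaussian mixture from two components, splitting one component and re-running EM at each scale, and keeps the number of components that maximizes a harmony value. The EM step is delegated to scikit-learn's GaussianMixture; the split heuristic, the k_max bound, and the simplified empirical form of the harmony value are illustrative assumptions and are not taken from the paper.

# A minimal sketch of the scale-incremental idea, assuming scikit-learn's
# GaussianMixture for the EM step. The split heuristic and the simplified
# harmony value below are assumptions, not the paper's exact formulation.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def byy_harmony(gmm, X):
    # Empirical harmony value: (1/N) * sum_t sum_j p(j|x_t) * ln[alpha_j * q(x_t|theta_j)]
    resp = gmm.predict_proba(X)  # posteriors p(j | x_t)
    log_joint = np.empty_like(resp)
    for j in range(gmm.n_components):
        log_joint[:, j] = np.log(gmm.weights_[j]) + multivariate_normal.logpdf(
            X, mean=gmm.means_[j], cov=gmm.covariances_[j])
    return float(np.mean(np.sum(resp * log_joint, axis=1)))

def split_heaviest_component(gmm):
    # Illustrative split rule: divide the heaviest component in two,
    # offsetting the new means along its leading eigenvector.
    j = int(np.argmax(gmm.weights_))
    vals, vecs = np.linalg.eigh(gmm.covariances_[j])
    offset = np.sqrt(vals[-1]) * vecs[:, -1]
    means = np.vstack([gmm.means_, gmm.means_[j] + offset])
    means[j] = means[j] - offset
    weights = np.append(gmm.weights_, gmm.weights_[j] / 2.0)
    weights[j] /= 2.0
    return means, weights / weights.sum()

def scale_incremental_em(X, k_max=10, seed=0):
    # Start from two components; after each EM run, split one component and
    # refit, keeping the scale k that maximizes the harmony value.
    gmm = GaussianMixture(n_components=2, random_state=seed).fit(X)
    best, best_h = gmm, byy_harmony(gmm, X)
    for k in range(3, k_max + 1):
        means, weights = split_heaviest_component(gmm)
        gmm = GaussianMixture(n_components=k, means_init=means,
                              weights_init=weights, random_state=seed).fit(X)
        h = byy_harmony(gmm, X)
        if h > best_h:
            best, best_h = gmm, h
    return best, best_h
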
Related Topics
Physical Sciences and Engineering; Mathematics; Applied Mathematics