Article ID: 409440
Journal: Neurocomputing
Published Year: 2006
Pages: 4 Pages
File Type: PDF
Abstract

For Gaussian mixture modeling, a key problem is selecting the number of Gaussians in the mixture. Based on regularization theory, we address this model selection problem by implementing an iterative algorithm for entropy regularized likelihood (ERL) learning on Gaussian mixtures. Simulation experiments demonstrate that the ERL algorithm can automatically detect the number of Gaussians and give a good estimate of the parameters of the original mixture, even on a sample set with a high degree of overlap. Moreover, the ERL algorithm also leads to a promising result when applied to the classification of the iris data.
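
The abstract does not spell out the ERL update rules, but the general idea it describes, namely fitting more components than needed and letting an entropy-style penalty on the mixing weights drive redundant components to zero, can be sketched as below. The function erl_gmm, the penalty coefficient gamma, the particular form of the weight update, and the pruning threshold prune_tol are illustrative assumptions, not the paper's actual formulation.

# Hypothetical sketch: EM-style Gaussian mixture fitting with an entropy-style
# penalty on the mixing weights; components whose weights collapse are pruned.
# Illustrates entropy-regularized model selection in general, not the exact
# ERL update rules of the paper.
import numpy as np

def erl_gmm(X, k_max=10, gamma=0.1, n_iter=200, prune_tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Start with k_max components centered on random data points.
    means = X[rng.choice(n, k_max, replace=False)].astype(float)
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * k_max)
    weights = np.full(k_max, 1.0 / k_max)

    for _ in range(n_iter):
        k = len(weights)
        # E-step: posterior responsibilities under the current parameters.
        log_r = np.empty((n, k))
        for j in range(k):
            diff = X - means[j]
            inv = np.linalg.inv(covs[j])
            _, logdet = np.linalg.slogdet(covs[j])
            quad = np.einsum('ni,ij,nj->n', diff, inv, diff)
            log_r[:, j] = np.log(weights[j]) - 0.5 * (quad + logdet + d * np.log(2 * np.pi))
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step with an entropy-style penalty (assumed form): components whose
        # log-weight falls below the mixture's average log-weight are pushed
        # further toward zero, which favors a compact mixture.
        nk = r.sum(axis=0)
        weights = nk / n + gamma * weights * (np.log(weights) - np.sum(weights * np.log(weights)))
        weights = np.clip(weights, 0.0, None)
        weights /= weights.sum()
        for j in range(k):
            if nk[j] > 1e-10:
                means[j] = r[:, j] @ X / nk[j]
                diff = X - means[j]
                covs[j] = (r[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)

        # Prune components whose mixing weight has collapsed.
        keep = weights > prune_tol
        if not keep.all():
            means, covs, weights = means[keep], covs[keep], weights[keep]
            weights /= weights.sum()

    return weights, means, covs

Run on data such as the four-dimensional iris measurements with k_max set above the expected number of clusters, this sketch would, under the assumed regularizer, retain only the well-supported components.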

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors