Article ID: 10370399
Journal: Signal Processing
Published Year: 2005
Pages: 13
File Type: PDF
Abstract
A new log-likelihood (LL)-based metric for goodness-of-fit testing and for monitoring unsupervised learning of mixture densities is introduced, called the differential LL. We develop the metric for the case of a Gaussian kernel fitted to a Gaussian distribution. We suggest a possible differential LL learning strategy, show the formal link with the Kullback-Leibler divergence and the quantization error, and introduce a factorial Gaussian approximation of the distribution by subspaces.
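The connection the abstract mentions between the log-likelihood and the Kullback-Leibler divergence can be illustrated with a minimal sketch (not the paper's algorithm): for samples from a distribution p scored under a model q, the expected LL equals -H(p) - KL(p||q), so when a Gaussian kernel is fitted to Gaussian data the empirical mean LL approaches the negative entropy of p as KL(p||q) vanishes. All variable names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from a true Gaussian p = N(mu_true, sigma_true^2)
mu_true, sigma_true = 0.0, 1.0
x = rng.normal(mu_true, sigma_true, size=100_000)

# Maximum-likelihood Gaussian fit q = N(mu_hat, sigma_hat^2)
mu_hat, sigma_hat = x.mean(), x.std()

def gauss_loglik(x, mu, sigma):
    # Pointwise log-density of N(mu, sigma^2)
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def kl_gauss(m1, s1, m2, s2):
    # Closed-form KL( N(m1, s1^2) || N(m2, s2^2) )
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

# Empirical mean LL of the samples under the fitted model
mean_ll = gauss_loglik(x, mu_hat, sigma_hat).mean()

# Theoretical counterpart: -H(p) - KL(p || q)
entropy = 0.5 * np.log(2 * np.pi * np.e * sigma_true**2)
theory = -entropy - kl_gauss(mu_true, sigma_true, mu_hat, sigma_hat)

print(mean_ll, theory)  # the two values nearly coincide
```

Since q is fitted to data drawn from p, KL(p||q) is close to zero and the mean LL sits near the negative entropy -H(p); a mismatch between the two quantities is exactly what a goodness-of-fit metric of this kind would detect.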