Article ID: 10362345
Journal: Signal Processing: Image Communication
Published Year: 2005
Pages: 27 Pages
File Type: PDF
Abstract
Gauss mixtures have gained popularity in statistics and statistical signal processing applications for a variety of reasons, including their ability to closely approximate a large class of interesting densities and the availability of algorithms such as the Baum-Welch or expectation-maximization (EM) algorithm for constructing the models based on observed data. We here consider a quantization approach to Gauss mixture design based on the information theoretic view of Gaussian sources as a “worst case” for robust signal compression. Results in high-rate quantization theory suggest distortion measures suitable for Lloyd clustering of Gaussian components based on a training set of data. The approach provides a Gauss mixture model and an associated Gauss mixture vector quantizer which is locally robust. We describe the quantizer mismatch distortion and its relation to other distortion measures, including the traditional squared error, the Kullback-Leibler (relative entropy) and minimum discrimination information, and the log-likelihood distortions. The resulting Lloyd clustering algorithm is demonstrated by applications to image vector quantization, texture classification, and North Atlantic pipeline image classification.
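To make the Lloyd clustering idea concrete, the following is a minimal sketch, not the paper's method: it assigns each training vector to the Gaussian component minimizing a log-likelihood distortion (one of the distortion measures named in the abstract, standing in for the quantizer mismatch distortion, which is not specified here) and then refits each component from its assigned vectors. The function name, parameters, and initialization are illustrative assumptions.

```python
import numpy as np

def lloyd_gauss_mixture(X, K, iters=20, seed=0, reg=1e-6):
    """Sketch of Lloyd clustering of Gaussian components.

    Alternates a min-distortion assignment step (using -log N(x; mu_k, Sigma_k)
    as the distortion, an assumption standing in for the paper's quantizer
    mismatch distortion) with a centroid step that refits each component
    from the vectors assigned to it.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize component means from random training vectors, identity covariances.
    means = X[rng.choice(n, K, replace=False)].copy()
    covs = np.stack([np.eye(d) for _ in range(K)])

    for _ in range(iters):
        # Distortion of every training vector with respect to every component.
        dist = np.empty((n, K))
        for k in range(K):
            cov = covs[k] + reg * np.eye(d)          # regularize for invertibility
            diff = X - means[k]
            inv = np.linalg.inv(cov)
            _, logdet = np.linalg.slogdet(cov)
            maha = np.einsum("ij,jk,ik->i", diff, inv, diff)
            dist[:, k] = 0.5 * (maha + logdet + d * np.log(2 * np.pi))
        assign = dist.argmin(axis=1)                  # min-distortion assignment step
        for k in range(K):                            # centroid step: refit component k
            Xk = X[assign == k]
            if len(Xk) > d:                           # skip underpopulated cells
                means[k] = Xk.mean(axis=0)
                covs[k] = np.cov(Xk, rowvar=False)
    return means, covs, assign
```

The output is both a Gauss mixture model (the component means and covariances) and a vector quantizer codebook (the partition induced by the min-distortion assignment), which mirrors the dual role described in the abstract.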
Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition
Authors