Article ID | Journal ID | Year | English Article | Full Text |
---|---|---|---|---|
530413 | 869765 | 2014 | 11-page PDF | Free download |
• A new model, which includes GMM, LMM, and GGMM as special cases, is presented.
• The new distribution is applied to non-Gaussian and bounded-support data.
• Our model is sufficiently flexible to fit different shapes of observed data.
• We propose an alternating approach to minimize an upper bound on the negative log-likelihood function.
The generalized Gaussian mixture model (GGMM) is a flexible and suitable tool for many computer vision and pattern recognition problems. However, the generalized Gaussian distribution has unbounded support, whereas in many applications the observed data are digitized and therefore have bounded support. This paper presents a new bounded generalized Gaussian mixture model (BGGMM), which includes the Gaussian mixture model (GMM), the Laplace mixture model (LMM), and the GGMM as special cases. The proposed extension of the generalized Gaussian distribution is flexible enough to fit observed data of different shapes, including non-Gaussian and bounded-support data. To estimate the model parameters, we propose an alternating approach that minimizes an upper bound on the negative log-likelihood of the data. We quantify the performance of the BGGMM with simulations and real data.
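The abstract does not give the paper's formulas, but the general construction it describes can be sketched: take a generalized Gaussian density (shape parameter β, where β = 2 gives the Gaussian and β = 1 the Laplace special cases the highlights mention), truncate it to a bounded support, renormalize, and mix. The parameter names (`mu`, `alpha`, `beta`) and the numerical trapezoidal renormalization below are illustrative assumptions, not the authors' estimator:

```python
import math

def ggd_pdf(x, mu, alpha, beta):
    # Generalized Gaussian density with location mu, scale alpha, shape beta.
    # beta = 2 recovers the Gaussian; beta = 1 recovers the Laplace.
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * math.exp(-((abs(x - mu) / alpha) ** beta))

def bounded_ggd_pdf(x, mu, alpha, beta, support=(0.0, 1.0), n=2000):
    # Truncate the density to the bounded support and renormalize by the
    # probability mass inside it (trapezoidal rule; a numerical stand-in
    # for the normalization integral).
    a, b = support
    if not (a <= x <= b):
        return 0.0
    h = (b - a) / n
    mass = sum(ggd_pdf(a + i * h, mu, alpha, beta) for i in range(n + 1))
    mass -= 0.5 * (ggd_pdf(a, mu, alpha, beta) + ggd_pdf(b, mu, alpha, beta))
    mass *= h
    return ggd_pdf(x, mu, alpha, beta) / mass

def bggmm_pdf(x, weights, components, support=(0.0, 1.0)):
    # Mixture of bounded generalized Gaussians:
    # components is a list of (mu, alpha, beta) tuples.
    return sum(w * bounded_ggd_pdf(x, mu, alpha, beta, support)
               for w, (mu, alpha, beta) in zip(weights, components))
```

Each truncated component integrates to one over its support by construction, so the mixture remains a proper density on the bounded range; fitting the parameters (the paper's alternating minimization of an upper bound on the negative log-likelihood) is not shown here.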
Journal: Pattern Recognition - Volume 47, Issue 9, September 2014, Pages 3132–3142