| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 536900 | Pattern Recognition Letters | 2006 | 4 | |
A statistical model of subband wavelet coefficients fitted with the generalized Gaussian density (GGD) has been widely used in image retrieval, classification, segmentation, denoising, and analysis/synthesis. Moment estimation is a very simple method for estimating the GGD parameters, but it is not accurate. Maximum likelihood estimation is obtained by solving a transcendental equation. Unfortunately, the equation has no analytical solution and must be solved numerically. We find that Newton–Raphson iteration converges very slowly for this transcendental equation. To speed up convergence, we apply the Regula Falsi method in place of the Newton–Raphson method. Experiments show that the Regula Falsi method reaches an accuracy of the order of 10⁻⁷ in three iterations.
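As a rough illustration of the approach described in the abstract, the sketch below estimates the GGD shape parameter by solving a maximum-likelihood transcendental equation (written here in the common Do–Vetterli form) with a Regula Falsi root finder. This is a minimal sketch, not the paper's implementation: the function names, the bracketing interval [0.1, 5.0], and the small constant added before taking logarithms are illustrative assumptions.

```python
import numpy as np
from scipy.special import digamma

def gg_ml_equation(beta, x):
    """Transcendental ML equation g(beta) = 0 for the GGD shape parameter
    (common Do-Vetterli form); x are the subband wavelet coefficients."""
    ax = np.abs(x) + 1e-12                      # assumed guard against log(0)
    s = np.sum(ax ** beta)
    return (1.0 + digamma(1.0 / beta) / beta
            - np.sum(ax ** beta * np.log(ax)) / s
            + np.log(beta / len(x) * s) / beta)

def regula_falsi(f, a, b, tol=1e-7, max_iter=50):
    """Regula Falsi (false position) root finding on a bracket [a, b]
    with f(a) * f(b) < 0."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("Root is not bracketed by [a, b].")
    for _ in range(max_iter):
        c = b - fb * (b - a) / (fb - fa)        # secant's crossing of the x-axis
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fa * fc < 0:                         # keep the sub-interval that brackets the root
            b, fb = c, fc
        else:
            a, fa = c, fc
    return c

# Example: fit a GGD to synthetic "subband" data (true shape beta = 2, i.e. Gaussian).
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
beta_hat = regula_falsi(lambda b: gg_ml_equation(b, x), 0.1, 5.0)
alpha_hat = (beta_hat / len(x) * np.sum(np.abs(x) ** beta_hat)) ** (1.0 / beta_hat)
print(beta_hat, alpha_hat)
```

Unlike Newton–Raphson, false position keeps the root bracketed at every step, so each iterate stays inside an interval known to contain the solution even where the equation is nearly flat.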