Article ID: 405069
Journal: Neural Networks
Published Year: 2006
Pages: 11
File Type: PDF
Abstract

This study presents a new vector quantization method that generates codewords incrementally. New codewords are inserted in the regions of the input vector space where the distortion error is highest, until the desired number of codewords (or a distortion-error threshold) is reached. Adopting an adaptive distance function greatly improves the proposed method's performance. During the incremental process, a removal–insertion technique fine-tunes the codebook, making the method independent of initial conditions. The proposed method outperforms recently published efficient algorithms such as Enhanced LBG (Patanè & Russo, 2001) on the traditional task: given a fixed number of codewords, find a codebook that minimizes the distortion error. It can also solve a task that traditional methods cannot: given a fixed distortion error, minimize the number of codewords and find a suitable codebook. Experiments on several image compression problems show that the proposed method works well.
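The incremental scheme the abstract describes — grow the codebook by inserting a codeword near the highest-distortion region, then refine — can be illustrated with a minimal sketch. This is a generic illustration, not the paper's exact algorithm: all function names are hypothetical, plain squared Euclidean distance stands in for the paper's adaptive distance function, and the removal–insertion fine-tuning step is omitted.

```python
# Sketch of incremental vector quantization (hypothetical helper names;
# not the paper's exact method). Start from a single codeword and
# repeatedly insert a perturbed copy of the codeword whose cell has the
# largest total distortion, refining with a few Lloyd (k-means) steps.
import random

def sq_dist(a, b):
    # Squared Euclidean distance (the paper uses an adaptive distance).
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest(codebook, v):
    # Index of the codeword closest to vector v.
    return min(range(len(codebook)), key=lambda i: sq_dist(codebook[i], v))

def lloyd_step(data, codebook):
    # Assign each vector to its nearest codeword, then move each
    # codeword to the centroid of its cell. Returns the cells.
    cells = [[] for _ in codebook]
    for v in data:
        cells[nearest(codebook, v)].append(v)
    for i, cell in enumerate(cells):
        if cell:
            dim = len(cell[0])
            codebook[i] = tuple(sum(v[d] for v in cell) / len(cell)
                                for d in range(dim))
    return cells

def incremental_vq(data, n_codewords, refine_iters=5, seed=0):
    rng = random.Random(seed)
    codebook = [tuple(data[0])]
    while len(codebook) < n_codewords:
        cells = lloyd_step(data, codebook)
        # Total distortion contributed by each codeword's cell.
        distortions = [sum(sq_dist(codebook[i], v) for v in cell)
                       for i, cell in enumerate(cells)]
        worst = max(range(len(codebook)), key=lambda i: distortions[i])
        # Insert a slightly perturbed copy of the worst codeword.
        codebook.append(tuple(x + rng.uniform(-1e-3, 1e-3)
                              for x in codebook[worst]))
        for _ in range(refine_iters):
            lloyd_step(data, codebook)
    return codebook
```

The same loop supports the second task the abstract mentions: instead of stopping at a fixed codebook size, one would keep inserting codewords until the total distortion falls below a given threshold.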

Related Topics: Physical Sciences and Engineering › Computer Science › Artificial Intelligence