Article ID: 534962
Journal: Pattern Recognition Letters
Published Year: 2009
Pages: 11
File Type: PDF
Abstract

In this paper we present a necessary and sufficient condition for the global optimality of unsupervised Learning Vector Quantization (LVQ) in kernel space. In particular, we generalize the results presented for expansive and competitive learning for vector quantization in Euclidean space to the general case of a kernel-based distance metric. Based on this result, we present a novel kernel LVQ algorithm with an update rule consisting of two terms: the former regulates the force of attraction between the synaptic weight vectors and the inputs; the latter regulates the repulsion between the weights and the center of gravity of the dataset. We show how this algorithm pursues global optimality of the quantization error by means of the repulsion mechanism. Simulation results are provided to show the performance of the model on common image quantization tasks: in particular, the algorithm is shown to outperform recently published quantization models such as Enhanced LBG [Patané, G., Russo, M., 2001. The enhanced LBG algorithm. Neural Networks 14 (9), 1219–1237] and Adaptive Incremental LBG [Shen, F., Hasegawa, O., 2006. An adaptive incremental LBG for vector quantization. Neural Networks 19 (5), 694–704].
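The abstract describes the two-term update rule only qualitatively, so the following is a minimal sketch of how such an attraction/repulsion update might look, not the paper's actual method. It assumes an RBF kernel, for which the kernel-induced distance d²(x, w) = k(x, x) − 2k(x, w) + k(w, w) reduces to 2 − 2k(x, w), so winner selection amounts to maximizing k(x, w); it also assumes the codewords are maintained in input space. All names (kernel_lvq, eta_attr, eta_rep, gamma) are hypothetical.

```python
import numpy as np

def rbf_kernel(x, W, gamma=1.0):
    # Gaussian (RBF) kernel between one input vector x and each row of W.
    return np.exp(-gamma * np.sum((W - x) ** 2, axis=1))

def kernel_lvq(X, n_codewords=8, n_epochs=20,
               eta_attr=0.05, eta_rep=0.01, gamma=1.0, seed=0):
    # Illustrative sketch (hypothetical parameters, not from the paper).
    rng = np.random.default_rng(seed)
    # Initialize codewords on randomly chosen input samples.
    W = X[rng.choice(len(X), n_codewords, replace=False)].copy()
    g = X.mean(axis=0)  # center of gravity of the dataset
    for _ in range(n_epochs):
        for x in X[rng.permutation(len(X))]:
            # For an RBF kernel, minimizing the kernel-induced distance
            # is equivalent to maximizing k(x, w).
            j = np.argmax(rbf_kernel(x, W, gamma))
            # Two-term update: attraction toward the input plus
            # repulsion of the winner away from the dataset centroid.
            W[j] += eta_attr * (x - W[j]) + eta_rep * (W[j] - g)
    return W
```

For an image quantization task like those in the experiments, X would be the image's pixels reshaped to shape (n_pixels, 3), and the returned codewords would serve as the quantized color palette.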

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition