Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
4948282 | Neurocomputing | 2016 | 9 Pages | |
Abstract
Product quantization and its variants have emerged in approximate nearest neighbor search, with a wide range of applications. However, the optimal division of product subspaces remains an open problem that largely degrades retrieval accuracy. In this paper, an extremely optimized product quantization scheme is introduced, which ensures, both theoretically and experimentally, a much better subspace partition compared to the existing state-of-the-art PQ and OPQ. The key innovation is to formulate subspace partition as a graph-based optimization problem, on which dynamic programming is leveraged to pursue optimal quantizer learning. Another advantage is that the proposed scheme is easily integrated with the cutting-edge multi-indexing structure, with nearly negligible additional overhead. We have conducted a series of large-scale quantitative evaluations, with comparisons to a group of recent works including PQ, OPQ, and Multi-Index. We show superior performance gains on the widely used SIFT1B benchmark, which validates the merits of the proposed algorithm.
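To make the abstract's key idea concrete, the sketch below illustrates how a subspace partition could be chosen by dynamic programming over candidate cut points. This is only a minimal, assumption-laden illustration, not the paper's algorithm: it assumes contiguous dimension segments, uses per-segment k-means distortion as the cost of each graph edge, and omits any dimension permutation of the kind learned by OPQ. The names `segment_cost` and `dp_partition`, and the codebook size `k`, are hypothetical.

```python
# Hypothetical sketch of DP-based subspace partitioning (not the paper's method).
import numpy as np
from sklearn.cluster import KMeans

def segment_cost(X, lo, hi, k=16):
    """Quantization distortion of dimensions [lo, hi) under a k-word codebook."""
    km = KMeans(n_clusters=k, n_init=4, random_state=0).fit(X[:, lo:hi])
    return km.inertia_

def dp_partition(X, M, k=16):
    """Split D dimensions into M contiguous segments minimizing total distortion."""
    D = X.shape[1]
    # Edge cost of the segment covering dimensions [lo, hi).
    cost = np.full((D + 1, D + 1), np.inf)
    for lo in range(D):
        for hi in range(lo + 1, D + 1):
            cost[lo, hi] = segment_cost(X, lo, hi, k)
    # dp[m, d] = best cost of covering the first d dimensions with m segments.
    dp = np.full((M + 1, D + 1), np.inf)
    parent = np.zeros((M + 1, D + 1), dtype=int)
    dp[0, 0] = 0.0
    for m in range(1, M + 1):
        for d in range(1, D + 1):
            for cut in range(m - 1, d):
                c = dp[m - 1, cut] + cost[cut, d]
                if c < dp[m, d]:
                    dp[m, d], parent[m, d] = c, cut
    # Backtrack to recover the M segment boundaries.
    bounds, d = [D], D
    for m in range(M, 0, -1):
        d = parent[m, d]
        bounds.append(d)
    return list(reversed(bounds))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 16)).astype(np.float32)
    print(dp_partition(X, M=4))  # e.g. [0, 4, 8, 12, 16]
```

Under these assumptions, the graph nodes are candidate cut positions between dimensions and the DP finds the minimum-cost path of M segments; the actual cost function and graph construction used by the authors may differ.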
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Yuanzheng Cai, Rongrong Ji, Shaozi Li