Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
534178 | 870230 | 2012 | 8-page PDF | Free download |

Increasingly powerful mobile devices have made mobile visual search a popular and distinctive image retrieval application. Such applications face a number of challenges arising from appearance variations in mobile images. Bag-of-words approaches improve the performance of state-of-the-art image retrieval systems; however, for visual search with mobile images exhibiting large variations, at least two critical issues remain unsolved: (1) the loss of feature discriminative power due to quantization, and (2) the underuse of spatial relationships among visual words. To address both issues, this paper presents a novel visual search method based on feature grouping and local soft match, which accounts for the properties of mobile images and couples visual and spatial information consistently. First, features of the query image are grouped using both matched visual features and their spatial relationships; the grouped features are then softly matched to alleviate quantization loss. An efficient scoring scheme is devised to exploit an inverted file index, and it is compared with vocabulary-guided pyramid kernels. Finally, experiments on the Stanford mobile visual search database and a collected database with more than one million images show that the proposed method achieves a promising improvement over the vocabulary-tree approach, especially when large variations exist in the query images.
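To make the two core ideas above concrete, here is a minimal, illustrative sketch of soft assignment against a visual vocabulary combined with inverted-file scoring. This is not the paper's exact formulation: the toy vocabulary size, the Gaussian weighting, the parameters `k` and `sigma`, and the voting-style score are all assumptions chosen for clarity, and real systems would use large vocabularies over 128-D SIFT descriptors.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)

# Toy visual vocabulary: 8 "words" in a 4-D descriptor space
# (a stand-in for the ~1M-word vocabularies used in practice).
vocab = rng.normal(size=(8, 4))

def soft_assign(desc, vocab, k=3, sigma=0.5):
    """Softly match a descriptor to its k nearest visual words,
    weighting each word by a Gaussian of the quantization distance.
    Hard quantization keeps only the single nearest word; soft
    assignment retains neighbors and so alleviates quantization loss."""
    d = np.linalg.norm(vocab - desc, axis=1)
    nearest = np.argsort(d)[:k]
    w = np.exp(-d[nearest] ** 2 / (2 * sigma ** 2))
    w = w / w.sum()  # normalize weights to sum to 1
    return list(zip(nearest.tolist(), w.tolist()))

# Inverted file index: word id -> list of (image id, weight) postings,
# so scoring only touches images that share words with the query.
index = defaultdict(list)
db = {img: rng.normal(size=(5, 4)) for img in range(3)}  # 3 images, 5 features each
for img, feats in db.items():
    for f in feats:
        for word, w in soft_assign(f, vocab):
            index[word].append((img, w))

def score(query_feats):
    """Accumulate weighted votes over the inverted index;
    a higher score means a more similar database image."""
    scores = defaultdict(float)
    for f in query_feats:
        for word, wq in soft_assign(f, vocab):
            for img, wd in index[word]:
                scores[img] += wq * wd
    return sorted(scores.items(), key=lambda t: -t[1])

print(score(db[1]))  # ranked (image id, score) pairs for a query
```

The inverted index is what keeps this tractable at the million-image scale mentioned in the abstract: the query never scans all images, only the postings of the words it activates. The paper's method additionally groups query features by spatial relationships before matching, which this sketch omits.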
► We propose a mobile visual search method based on visual and spatial consistency.
► Simple but effective feature grouping improves feature discriminative power.
► Local soft match in word space alleviates feature quantization loss.
► An efficient scoring scheme and index guarantee search efficiency.
► A large mobile visual search dataset is built and released.
Journal: Pattern Recognition Letters - Volume 33, Issue 3, 1 February 2012, Pages 239–246