Article Code | Journal Code | Publication Year | Original Article (English) | Full Text |
---|---|---|---|---|
532038 | 869898 | 2015 | 12-page PDF | Free download |
• A nonlocal reconstruction-based bottom-up saliency estimation method is proposed.
• Nonlocal image processing is employed to provide a better treatment of textures.
• Sparsity pursuit is very effective for removing duplicated similar image regions.
• Reconstruction residual is inherently coherent with the bottom-up saliency.
• Our method is stable across three public datasets and consistent with human fixations.
Many saliency models treat feature extraction as the algorithmic core, so their performance depends to a great extent on the features selected. However, hardly any single set of features can pop out the salient regions across diverse visual environments. Moreover, because saliency is not tuned to particular visual features, a location that wins the spatial competition in any feature space can be defined as salient. Instead of seeking or learning features that highlight the difference between the salient areas and the background, we focus on the sparsity and uniqueness carried by the original image itself, the source of all features, and propose a nonlocal reconstruction-based saliency model. In the proposed approach, saliency is measured by the residual of sparsely reconstructing the central patch as a linear combination of its surrounding patches, sampled in a nonlocal manner. This scheme is further generalized to model global saliency, which complements the nonlocal saliency and improves performance further. As a generalization of Itti et al.'s classical center–surround comparison scheme, the proposed approach performs well on images where Itti et al.'s method fails, as well as on general natural images. Numerical experiments show that the proposed approach produces better results than state-of-the-art algorithms on three public databases.
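The core idea above, scoring a patch by how poorly it is sparsely reconstructed from its nonlocal surroundings, can be sketched in a few lines. The sketch below is not the authors' implementation: it uses plain ISTA iterations to solve a lasso problem over a dictionary of surrounding patches, and all names (`sparse_residual_saliency`, the penalty `lam`, the iteration count) are illustrative assumptions.

```python
import numpy as np

def sparse_residual_saliency(center, surround, lam=0.1, n_iter=200):
    """Saliency of a central patch as its sparse reconstruction residual.

    center   : (d,) vectorized central patch
    surround : (d, n) dictionary whose columns are nonlocally sampled
               surrounding patches (hypothetical interface)
    """
    # Normalize dictionary columns so the penalty acts uniformly.
    D = surround / (np.linalg.norm(surround, axis=0, keepdims=True) + 1e-12)
    x = np.zeros(D.shape[1])
    # Step size from the Lipschitz constant of the quadratic data term.
    L = np.linalg.norm(D, 2) ** 2
    for _ in range(n_iter):  # ISTA iterations for min 0.5||Dx-c||^2 + lam||x||_1
        grad = D.T @ (D @ x - center)
        x = x - grad / L
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    # Large residual = patch cannot be explained by its surroundings = salient.
    return np.linalg.norm(center - D @ x)
```

A patch that duplicates the surrounding texture is reconstructed almost exactly and gets a near-zero residual, while a unique patch leaves a large residual, which is why the residual is inherently coherent with bottom-up saliency.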
Journal: Pattern Recognition - Volume 48, Issue 4, April 2015, Pages 1337–1348