Article ID: 4970416
Journal: Signal Processing: Image Communication
Published Year: 2017
Pages: 13
File Type: PDF
Abstract
We develop a new model for no-reference 3D stereopair quality assessment that accounts for the effects of binocular fusion, rivalry, suppression, and a reverse saliency effect on the perception of distortion. The resulting framework, dubbed the S3D INtegrated Quality (SINQ) Predictor, first fuses the left and right views of a stereopair into a single synthesized cyclopean image using a novel modification of an existing binocular perceptual model. Specifically, the left and right views of a stereopair are fused using a measure of “cyclopean” spatial activity. A simple binocular product image is also computed as the correlation between left and right disparity-corrected corresponding pixels. Univariate and bivariate statistical features are extracted from the four available image sources: the left view, the right view, the synthesized “cyclopean” spatial activity image, and the binocular product image. Based on recent evidence regarding the placement of 3D fixations by subjects viewing stereoscopic 3D (S3D) content, we also apply a reverse saliency weighting to the normalized “cyclopean” spatial activity image. Both one- and two-stage frameworks are then used to map the feature vectors to predicted quality scores. SINQ is thoroughly evaluated on the LIVE 3D Image Quality Database (Phase I and Phase II). The experimental results show that SINQ delivers better performance than state-of-the-art 2D and 3D quality assessment methods on six public databases, especially on asymmetrically distorted stereopairs.
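As an illustration of the activity-weighted binocular fusion and product-image computation described in the abstract, the following is a minimal Python sketch. It assumes local standard deviation as the spatial-activity measure and an integer horizontal-disparity warp of the right view; the function names (local_spatial_activity, fuse_cyclopean) and these modeling choices are illustrative assumptions, not the paper's exact binocular model, and the statistical-feature extraction, reverse saliency weighting, and regression stages are omitted.

import numpy as np
from scipy.ndimage import uniform_filter

def local_spatial_activity(img, size=7):
    # Local standard deviation as a simple proxy for spatial activity (assumed measure).
    mu = uniform_filter(img, size)
    mu_sq = uniform_filter(img * img, size)
    return np.sqrt(np.maximum(mu_sq - mu * mu, 0.0))

def fuse_cyclopean(left, right, disparity, size=7):
    # Warp the right view to the left view's coordinates using horizontal disparity,
    # then fuse the two views with weights given by normalized local spatial activity.
    h, w = left.shape
    cols = np.arange(w)
    shifted_cols = np.clip(cols[None, :] + disparity.astype(int), 0, w - 1)
    right_warp = right[np.arange(h)[:, None], shifted_cols]

    act_l = local_spatial_activity(left, size)
    act_r = local_spatial_activity(right_warp, size)
    w_l = act_l / (act_l + act_r + 1e-12)
    w_r = 1.0 - w_l

    cyclopean = w_l * left + w_r * right_warp   # activity-weighted "cyclopean" fusion
    product = left * right_warp                 # simple binocular product image
    return cyclopean, product

# Toy usage: a right view that is a constant 2-pixel horizontal shift of the left view.
rng = np.random.default_rng(0)
left = rng.random((64, 96))
right = np.roll(left, 2, axis=1)
disparity = np.full((64, 96), 2.0)
cyclopean, product = fuse_cyclopean(left, right, disparity)

In this sketch the view whose neighborhood carries more spatial activity dominates the fused pixel, a rough stand-in for the binocular rivalry/suppression behavior the abstract refers to.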
Related Topics
Physical Sciences and Engineering > Computer Science > Computer Vision and Pattern Recognition