Article ID: 8146959
Journal: Infrared Physics & Technology
Published Year: 2015
Pages: 8
File Type: PDF
Abstract
Most available image fusion approaches cannot achieve a spatial resolution higher than that of the multisource input images. In this paper we propose a novel approach for simultaneous image super-resolution and fusion, based on the recently developed compressed sensing and multiscale dictionary learning techniques. Under a sparse prior on image patches and within the compressed sensing framework, multisource image fusion is reduced to a task of signal recovery from compressive measurements. A set of multiscale dictionaries is then learned from groups of example high-resolution (HR) image patches via a nonlinear optimization algorithm. Moreover, a linear-weight fusion rule is proposed to obtain the fused high-resolution image at each scale. Finally, the high-resolution result is derived by performing a low-rank decomposition on the recovered high-resolution images across the scales. Experiments are conducted to evaluate the performance of the proposed method, and the results demonstrate its superiority over competing approaches.
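As a rough illustration of the sparse-coding-and-fusion idea described above, the sketch below fuses two registered grayscale images through sparse coefficients over a single pre-learned dictionary. It omits the paper's compressive measurement step, the multiscale dictionary set, and the final low-rank decomposition; the `learned_dictionary` argument, the coefficient-energy weights, and all parameter values are illustrative assumptions, not the authors' exact method.

```python
# Minimal sketch: sparse-coding-based fusion of two source images with one
# pre-learned dictionary. The dictionary, weights, and parameters are
# assumptions for illustration only.
import numpy as np
from sklearn.decomposition import sparse_encode
from sklearn.feature_extraction.image import (
    extract_patches_2d, reconstruct_from_patches_2d)

def fuse_images(img_a, img_b, learned_dictionary, patch_size=(8, 8), k=5):
    """Fuse two registered grayscale images via their sparse coefficients."""
    n_feat = patch_size[0] * patch_size[1]

    # 1. Slice both sources into overlapping patches and vectorize them.
    patches_a = extract_patches_2d(img_a, patch_size).reshape(-1, n_feat)
    patches_b = extract_patches_2d(img_b, patch_size).reshape(-1, n_feat)

    # 2. Sparse-code each patch over the learned dictionary (OMP, k atoms).
    codes_a = sparse_encode(patches_a, learned_dictionary,
                            algorithm='omp', n_nonzero_coefs=k)
    codes_b = sparse_encode(patches_b, learned_dictionary,
                            algorithm='omp', n_nonzero_coefs=k)

    # 3. Linear weighted fusion of the sparse coefficients: here each patch is
    #    weighted by its coefficient energy (one simple choice of activity level).
    e_a = np.linalg.norm(codes_a, axis=1, keepdims=True)
    e_b = np.linalg.norm(codes_b, axis=1, keepdims=True)
    w_a = e_a / (e_a + e_b + 1e-12)
    fused_codes = w_a * codes_a + (1.0 - w_a) * codes_b

    # 4. Reconstruct the fused patches and average overlaps back into an image.
    fused_patches = (fused_codes @ learned_dictionary).reshape(-1, *patch_size)
    return reconstruct_from_patches_2d(fused_patches, img_a.shape)
```

In practice the dictionary would be trained on example HR patches (e.g. with K-SVD or online dictionary learning), and the paper applies this kind of fusion at several scales before combining the per-scale results.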
Related Topics
Physical Sciences and Engineering; Physics and Astronomy; Atomic and Molecular Physics, and Optics
Authors
, ,