Article ID: 6941579
Journal: Signal Processing: Image Communication
Published Year: 2018
Pages: 11
File Type: PDF
Abstract
The motion of moving objects in turbulence-distorted videos is affected both by atmospheric turbulence fluctuations and by the objects' own movements. Simultaneously stabilizing the video and preserving the moving objects is therefore a challenging task. Using the monogenic signal, we propose a fast two-stage approach that mitigates these erratic motions while preserving the moving objects. In the first stage, each frame of a video is represented by its monogenic signal, whose local amplitude and phase model the turbulence-induced random brightness scintillation and local wiggles. A low-pass filter then attenuates the high-spatial-frequency variations of the local monogenic amplitude and phase, removing turbulence-induced distortions and yielding stable background frames. In the second stage, a two-step mask-generation scheme preserves the moving objects. First, coarse masks of the moving objects are obtained from the difference images between the distorted frames and the stable background frames. These coarse masks are then refined by analyzing the difference images of the local monogenic amplitude and phase between the distorted frames and the stable background frames. Finally, stable video frames containing the moving objects are reconstructed from the refined masks and the stable background frames. Experimental results show that the proposed approach is efficient and simultaneously provides video stabilization and preservation of moving objects under atmospheric turbulence.
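The local monogenic amplitude and phase at the heart of the first stage come from the Riesz transform, the 2-D generalization of the Hilbert transform. The sketch below, written against plain NumPy, shows one common frequency-domain construction of that decomposition; it is an illustrative assumption about the representation the abstract describes, not the authors' implementation, and in practice the input would first be band-passed (e.g. with a log-Gabor filter), since local phase is only well defined over a narrow band.

```python
import numpy as np

def monogenic_signal(img):
    """Monogenic decomposition of a 2-D image via the Riesz transform.

    Returns the local amplitude and local phase -- the two quantities
    that the paper low-pass filters to suppress turbulence-induced
    scintillation and local wiggles.  Illustrative sketch only.
    """
    img = np.asarray(img, dtype=float)
    rows, cols = img.shape

    # Normalized frequency grids for the two axes.
    u = np.fft.fftfreq(cols)[np.newaxis, :]
    v = np.fft.fftfreq(rows)[:, np.newaxis]
    radius = np.sqrt(u**2 + v**2)
    radius[0, 0] = 1.0  # avoid division by zero at the DC bin

    # Riesz transfer functions: H1 = -i*u/|w|, H2 = -i*v/|w|.
    H1 = -1j * u / radius
    H2 = -1j * v / radius

    F = np.fft.fft2(img)
    r1 = np.real(np.fft.ifft2(F * H1))  # first Riesz component
    r2 = np.real(np.fft.ifft2(F * H2))  # second Riesz component

    # Local amplitude (brightness/energy) and local phase (structure).
    amplitude = np.sqrt(img**2 + r1**2 + r2**2)
    phase = np.arctan2(np.sqrt(r1**2 + r2**2), img)
    return amplitude, phase
```

Once each frame is decomposed this way, the stabilization step reduces to smoothing the amplitude and phase sequences, after which stable background frames can be reconstructed.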
Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition