| Article ID | Journal | Published Year | Pages | File Type |
| --- | --- | --- | --- | --- |
| 8901612 | Journal of Computational and Applied Mathematics | 2019 | 13 | |
Abstract
With the advent of the approach known as "robust alignment by sparse and low-rank decomposition" (RASL), a batch of linearly correlated images can be aligned accurately and robustly despite significant corruptions and occlusions. This alignment task can be formulated as a sequence of 3-block convex minimization problems, which can be solved efficiently by the accelerated proximal gradient method (APG) or, alternatively, by the directly extended alternating direction method of multipliers (ADMM). However, the directly extended ADMM may diverge, even though it often performs well in numerical computations. Ideally, one would like an algorithm with both a theoretical convergence guarantee and numerical efficiency superior to the directly extended ADMM. We achieve this goal with the symmetric Gauss-Seidel iteration based ADMM (sGS-ADMM), which only needs to update one of the variables twice, yet this small modification is enough to guarantee convergence. The convergence of sGS-ADMM follows directly by relating it to the classical 2-block ADMM equipped with a couple of specially designed semi-proximal terms. Beyond this, we add a rank-correction term to the model to obtain alignment results of higher accuracy. Numerical experiments over a wide range of realistic misalignments demonstrate that sGS-ADMM is at least two times faster than RASL and APG for the vast majority of the tested problems.
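To illustrate the "update one variable twice" idea mentioned in the abstract, here is a minimal sketch of a symmetric Gauss-Seidel sweep inside an ADMM loop, applied to a toy 3-block problem (not the paper's image-alignment model). The problem, penalty parameter, block grouping, and the choice of which block is updated twice are all illustrative assumptions, not the authors' actual scheme.

```python
# Toy 3-block problem:
#   min 1/2 (x1^2 + x2^2 + x3^2)  s.t.  x1 + x2 + x3 = 3
# whose exact solution is x1 = x2 = x3 = 1.
# Augmented Lagrangian (penalty sigma, multiplier lam):
#   1/2 sum xi^2 - lam*(x1+x2+x3-3) + sigma/2 (x1+x2+x3-3)^2

sigma = 1.0          # penalty parameter (assumed value)
x1 = x2 = x3 = 0.0   # primal blocks
lam = 0.0            # Lagrange multiplier

def block_update(others_sum, lam):
    # Closed-form minimizer of the augmented Lagrangian in one block:
    #   argmin_x 1/2 x^2 - lam*x + sigma/2 (x + others_sum - 3)^2
    return (lam - sigma * (others_sum - 3.0)) / (1.0 + sigma)

for _ in range(500):
    # sGS-style sweep: x2 is the block updated twice
    x2 = block_update(x1 + x3, lam)   # backward half-sweep
    x3 = block_update(x1 + x2, lam)
    x2 = block_update(x1 + x3, lam)   # forward half-sweep (extra update)
    x1 = block_update(x2 + x3, lam)
    lam -= sigma * (x1 + x2 + x3 - 3.0)  # multiplier update
```

After the loop, (x1, x2, x3) is close to the true solution (1, 1, 1); the extra half-sweep is what makes the 3-block scheme reducible to a 2-block semi-proximal ADMM, which is the source of the convergence guarantee.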
Related Topics
Physical Sciences and Engineering
Mathematics
Applied Mathematics
Authors
Shuangyue Wang, Yunhai Xiao, Zhengfen Jin