Article code | Journal code | Year published | English article | Full-text version |
---|---|---|---|---|
533502 | 870124 | 2011 | 18-page PDF | Free download |
We present a graph-cuts based method for non-rigid medical image registration on brain magnetic resonance images. In this paper, the non-rigid medical image registration problem is reformulated as a discrete labeling problem. Based on a voxel-to-voxel intensity similarity measure, each voxel in the source image is assigned a displacement label, which represents a displacement vector indicating to which position in the floating image it spatially corresponds. In the proposed method, a smoothness constraint based on the first derivative is used to penalize sharp changes in the displacement labels of adjacent voxels. The image registration problem is therefore modeled by two energy terms based on intensity similarity and smoothness of the displacement field. These energy terms are submodular and can be optimized by the graph-cuts method via α-expansions, a powerful combinatorial optimization tool capable of yielding either a global minimum or a local minimum in a strong sense. Using realistic brain phantoms obtained from the Simulated Brain Database, we compare the registration results of the proposed method with two state-of-the-art medical image registration approaches: the free-form deformation based method and the demons method. In addition, the registration results are also compared with those of the linear programming based image registration method. The proposed method is found to be more robust across different challenging non-rigid registration cases, with consistently higher registration accuracy than those three methods, and produces realistic recovered deformation fields.
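The energy described above can be illustrated with a minimal 1-D toy sketch: each pixel `p` gets a displacement label `d_p`, the data term measures intensity disagreement between the source pixel and the displaced floating pixel, and the smoothness term penalizes first-derivative changes between neighboring labels. This is an illustrative assumption-laden sketch, not the paper's implementation; in particular, brute-force enumeration stands in for the graph-cuts α-expansion optimizer, which is only practical here because the example is tiny.

```python
import itertools

def registration_energy(source, floating, labels, lam=1.0):
    """Toy 1-D version of the two-term energy:
    E(d) = sum_p |source[p] - floating[p + d_p]|          (data term)
         + lam * sum_p |d_p - d_{p+1}|                    (smoothness term)
    """
    data = sum(abs(source[p] - floating[p + d]) for p, d in enumerate(labels))
    smooth = lam * sum(abs(labels[i] - labels[i + 1])
                       for i in range(len(labels) - 1))
    return data + smooth

def brute_force_minimize(source, floating, displacements, lam=1.0):
    """Exhaustive search over all labelings -- a stand-in for the
    alpha-expansion graph-cuts optimizer used in the actual method."""
    best, best_e = None, float("inf")
    for labels in itertools.product(displacements, repeat=len(source)):
        # Skip labelings that would index outside the floating image.
        if any(not 0 <= p + d < len(floating) for p, d in enumerate(labels)):
            continue
        e = registration_energy(source, floating, labels, lam)
        if e < best_e:
            best, best_e = labels, e
    return best, best_e

# Floating image whose bright bump sits one pixel to the right of the source's:
source = [0, 5, 5, 0, 0]
floating = [0, 0, 5, 5, 0, 0]
labels, energy = brute_force_minimize(source, floating, displacements=(0, 1))
print(labels, energy)  # a constant shift of +1 aligns the images exactly
```

A uniform displacement of +1 achieves zero data cost and zero smoothness cost, so the minimizer recovers it; in the real method the same energy is defined over 3-D voxels and minimized with α-expansion moves rather than enumeration.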
► We model the non-rigid registration as a multi-labeling problem by Markov random field.
► The MRF energy is minimized by the graph-cuts algorithm via α-expansions.
► Extensive experiments were conducted to evaluate our method in terms of accuracy and smoothness.
► We compared the proposed method with three state-of-the-art methods.
► Our method is more robust against different challenging cases with consistently higher accuracy.
Journal: Pattern Recognition - Volume 44, Issues 10–11, October–November 2011, Pages 2450–2467