Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
444101 | 692882 | 2012 | 17-page PDF | Free download |
A deformable registration method is described that enables automatic alignment of magnetic resonance (MR) and 3D transrectal ultrasound (TRUS) images of the prostate gland. The method employs a novel “model-to-image” registration approach in which a deformable model of the gland surface, derived from an MR image, is registered automatically to a TRUS volume by maximising the likelihood of a particular model shape given a voxel-intensity-based feature that represents an estimate of surface normal vectors at the boundary of the gland. The deformation of the surface model is constrained by a patient-specific statistical model of gland deformation, which is trained using data provided by biomechanical simulations. Each simulation predicts the motion of a volumetric finite element mesh due to the random placement of a TRUS probe in the rectum. The use of biomechanical modelling in this way also allows a dense displacement field to be calculated within the prostate, which is then used to non-rigidly warp the MR image to match the TRUS image. Using data acquired from eight patients, and anatomical landmarks to quantify the registration accuracy, the median final RMS target registration error after performing 100 MR–TRUS registrations for each patient was 2.40 mm.
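The patient-specific statistical model of gland deformation described above can be understood as principal component analysis (PCA) applied to displacement fields produced by the biomechanical simulations: registration then searches over a small number of mode weights rather than over every mesh-node position. The sketch below is an illustrative assumption of how such a model could be built with NumPy; the array sizes, variable names, and random placeholder data are hypothetical and not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each of 100 finite-element simulations of TRUS-probe
# placement yields a 3D displacement vector at each of 500 mesh nodes.
n_sims, n_nodes = 100, 500
displacements = rng.normal(size=(n_sims, n_nodes * 3))  # placeholder data

# Mean displacement and principal modes of variation via SVD-based PCA.
mean_disp = displacements.mean(axis=0)
centered = displacements - mean_disp
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Retain the leading modes explaining ~95% of the deformation variance.
var = S**2 / (n_sims - 1)
k = int(np.searchsorted(np.cumsum(var) / var.sum(), 0.95)) + 1
modes = Vt[:k]                # (k, n_nodes*3) deformation modes
stddevs = np.sqrt(var[:k])    # per-mode standard deviations

# Any allowed deformation is the mean plus a weighted sum of modes, so the
# optimiser adjusts only k weights, constrained by the training statistics.
weights = rng.normal(size=k) * stddevs
deformation = mean_disp + weights @ modes
print(deformation.shape)  # one full displacement field for the mesh
```

In this formulation the statistical model both regularises the surface deformation during model-to-image registration and, because each mode is a full volumetric displacement field, supplies the dense field needed to warp the MR image to the TRUS image.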
Graphical abstract: Deformable registration between MR and ultrasound images of the prostate using a statistical motion model and a novel probabilistic model-to-image registration algorithm.

Research highlights
► Ultrasound-probe-induced prostate motion is an important source of registration error.
► A biomechanically-informed statistical shape model constrains allowed deformations.
► Surface normal alignment is a robust and efficient approach to image registration.
► Deformable registration enables fast and accurate data fusion during prostate interventions.
Journal: Medical Image Analysis - Volume 16, Issue 3, April 2012, Pages 687–703