

Semi-Automatic Segmentation of the Prostate Midgland in Magnetic Resonance Images Using Shape and Local Appearance Similarity Analysis


M Shahedi(1,2,3)*, A Fenster(1,3,4,5), C Romagnoli(5), A D Ward(2,3,4), (1) Imaging Research Laboratories, Robarts Research Institute, Western University, London, ON, Canada, (2) Baines Imaging Research Laboratory, London Health Sciences Centre, London, ON, Canada, (3) Graduate Program in Biomedical Engineering, Western University, London, ON, Canada, (4) Department of Medical Biophysics, Western University, London, ON, Canada, (5) Department of Medical Imaging, Western University, London, ON, Canada

MO-G-BRA-3 Monday 5:15:00 PM - 6:00:00 PM Room: Ballroom A

Purpose: To design, develop, and test a semi-automatic segmentation method for the prostate midgland on T2-weighted (T2W) magnetic resonance (MR) images acquired using an endorectal (ER) coil, based on inter-subject similarity of prostate shape and local boundary appearance.

Method and Materials: We used T2W prostate MR images of 33 subjects acquired using an ER coil. From each image, we extracted the midgland axial slice closest to the verumontanum. We partitioned the images into training and test sets using leave-one-out cross-validation. We used the training images to define a point distribution model (PDM) describing shape variability, as well as 36 circular 'mean intensity patches' characterizing inter-subject local appearance at anatomically defined boundary locations. We chose a patch radius of 17 mm based on our assessment of segmentation error, measured as the mean absolute boundary distance (MAD), on a subset of 13 images during a systematic radius search from 5 mm to 25 mm. For each test image, we defined 36 homologous rays emanating from the centre of the prostate. Using a radial search strategy, we translated each mean intensity patch along its corresponding ray and computed the normalized cross-correlation (NCC) between the mean patch and the underlying patch in the test image at each point. We chose the point with the highest NCC along each ray and then used the PDM to regularize the selected points. To compare the algorithm's segmentation to manual delineations performed by one operator, we calculated the MAD and the Dice similarity coefficient (DSC).
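
For illustration, the following is a minimal Python/NumPy sketch of the radial NCC boundary search described above. It is not the authors' implementation: the function and parameter names (radial_ncc_search, extract_patch, mean_patches, patch_radius_px, max_step_px) are hypothetical, circular patches are approximated by square ones, equal angular spacing of the 36 rays is assumed for simplicity, and the 17 mm patch radius is assumed to have already been converted to pixels using the image spacing.

import numpy as np

def normalized_cross_correlation(a, b):
    # NCC between two equally sized patches (zero-mean, energy-normalized correlation).
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def extract_patch(image, y, x, r):
    # Square patch of half-width r centred at (y, x); None if it falls outside the image.
    if y - r < 0 or x - r < 0 or y + r + 1 > image.shape[0] or x + r + 1 > image.shape[1]:
        return None
    return image[y - r:y + r + 1, x - r:x + r + 1]

def radial_ncc_search(image, center, mean_patches, patch_radius_px, max_step_px=60):
    # For each of 36 rays emanating from the prostate centre, translate the learned
    # mean patch outward along the ray and keep the position with the highest NCC.
    cy, cx = center
    angles = np.deg2rad(np.arange(0, 360, 10))  # 36 rays, equal spacing assumed
    boundary_points = []
    for angle, mean_patch in zip(angles, mean_patches):
        best_ncc, best_point = -np.inf, (int(round(cy)), int(round(cx)))
        for step in range(1, max_step_px):
            y = int(round(cy + step * np.sin(angle)))
            x = int(round(cx + step * np.cos(angle)))
            patch = extract_patch(image, y, x, patch_radius_px)
            if patch is None:
                break
            ncc = normalized_cross_correlation(mean_patch, patch)
            if ncc > best_ncc:
                best_ncc, best_point = ncc, (y, x)
        boundary_points.append(best_point)
    # The 36 selected candidate points would then be regularized by projecting
    # them onto the shape space of the trained PDM.
    return np.array(boundary_points)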

Results: We measured a MAD of 1.6 +/- 1.0 mm and a DSC of 89 +/- 6% between the semi-automatically segmented contours and the manually delineated reference standard.
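
As a point of reference, the two reported metrics could be computed along the following lines, assuming both the algorithm's segmentation and the reference standard are available as 2D boolean masks with isotropic pixel spacing. This is a hedged sketch with hypothetical names, not the evaluation code used in the study, and a symmetric variant of the boundary distance is shown.

import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice_similarity_coefficient(mask_a, mask_b):
    # DSC = 2 * |A intersect B| / (|A| + |B|), returned as a fraction in [0, 1].
    intersection = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * intersection / (mask_a.sum() + mask_b.sum())

def mean_absolute_boundary_distance(mask_a, mask_b, pixel_spacing_mm=1.0):
    # Symmetric mean distance (in mm) between the two segmentation boundaries.
    def boundary(mask):
        return mask & ~binary_erosion(mask)
    ba, bb = boundary(mask_a), boundary(mask_b)
    dist_to_b = distance_transform_edt(~bb) * pixel_spacing_mm  # distance to boundary of B
    dist_to_a = distance_transform_edt(~ba) * pixel_spacing_mm  # distance to boundary of A
    return 0.5 * (dist_to_b[ba].mean() + dist_to_a[bb].mean())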

Conclusions: NCC of local prostate boundary regions with a learned mean boundary appearance may be suitable for boundary localization, with subsequent refinement by a PDM, on T2W MR images acquired using an ER coil.
