
Deep Convolutional Neural Network Image Matching for Ultrasound Guidance in Radiotherapy


N Zhu*, M Najafi, S Hancock, D Hristov, Stanford University Cancer Center, Palo Alto, CA


SU-C-207B-7 (Sunday, July 31, 2016) 1:00 PM - 1:55 PM Room: 207 B

Purpose: Robust matching of ultrasound images is a challenging problem, as images of the same anatomy often present non-trivial differences. This poses an obstacle for ultrasound guidance in radiotherapy. Our objective is to overcome this obstacle by designing and evaluating an image block matching framework based on a two-channel deep convolutional neural network.

Methods: We extend to 3D an algorithmic structure previously introduced for 2D image feature learning [1]. To obtain the similarity between two 3D image blocks A and B, the blocks are divided into 2D patches Ai and Bi, and the similarity is calculated as the average similarity score over the patch pairs (Ai, Bi). The network was trained on public non-medical image pairs and subsequently evaluated on ultrasound image blocks in the following scenarios: (S1) identical image blocks with and without shifts (A and A_shift_x); (S2) non-related random block pairs; (S3) ground-truth-registration matched pairs of different ultrasound images with and without shifts (A_i and A_reg_i_shift_x).
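The block-matching step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the trained two-channel CNN is replaced by a hypothetical stand-in score (normalized cross-correlation) so the slicing-and-averaging logic runs end to end.

```python
import numpy as np

def patch_similarity(a2d, b2d):
    # Stand-in for the trained two-channel CNN similarity score [1];
    # normalized cross-correlation is used here only so the sketch runs.
    a = (a2d - a2d.mean()) / (a2d.std() + 1e-8)
    b = (b2d - b2d.mean()) / (b2d.std() + 1e-8)
    return float((a * b).mean())

def block_similarity(A, B):
    """Similarity of two 3D blocks as the average 2D-patch score.

    A and B are 3D arrays of identical shape; each is sliced into
    2D patches Ai, Bi along the first axis, and the per-patch scores
    are averaged, mirroring the scheme described in Methods.
    """
    assert A.shape == B.shape
    scores = [patch_similarity(A[i], B[i]) for i in range(A.shape[0])]
    return float(np.mean(scores))

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 32, 32))   # a 3D image block
B = rng.standard_normal((8, 32, 32))   # an unrelated block (scenario S2)
print(block_similarity(A, A))          # identical blocks: near-maximal score
print(block_similarity(A, B))          # unrelated blocks: near zero
```

In the actual framework, `patch_similarity` would be the learned network's output, so absolute score values differ; only the decomposition of 3D blocks into 2D patch pairs and the averaging are reflected here.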

Results: For S1, the similarity scores of A and A_shift_x were 32.63, 18.38, 12.95, 9.23, 2.15, and 0.43 for x ranging from 0 mm to 10 mm in 2 mm increments. For S2, the average similarity score for non-related block pairs was -1.15. For S3, the average similarity score of ground-truth-registration matched blocks A_i and A_reg_i_shift_0 (1≤i≤5) was 12.37. After translating A_reg_i_shift_0 by 2 mm, 4 mm, 6 mm, 8 mm, and 10 mm, the average similarity scores of A_i and A_reg_i_shift_x were 11.04, 8.42, 4.56, 2.27, and 0.29, respectively.

Conclusion: The proposed method correctly assigns the highest similarity to corresponding 3D ultrasound image blocks despite differences in image content, and thus can form the basis for ultrasound image registration and tracking.

[1] Zagoruyko, Komodakis, "Learning to compare image patches via convolutional neural networks", IEEE CVPR 2015, pp. 4353-4361.
