
Program Information

A Preliminary Study of Incorporating a Novel Deep-Learning Method for Dose Distribution Prediction in Complex Head-and-Neck Cancers


J Fan*, J Wang, W Hu, Y Chen, H Zhang, Fudan University Shanghai Cancer Center, Shanghai, China

Presentations

SU-K-FS1-15 (Sunday, July 30, 2017) 4:00 PM - 6:00 PM Room: Four Seasons 1


Purpose: The purpose of this study is to develop a fast automatic algorithm based on deep learning that predicts the 3D dose distribution, in order to help ensure treatment plan quality and consistency.

Methods: A novel deep-learning approach is developed that predicts the achievable 3D dose distribution for a new patient from geometrical features. The algorithm involves two major steps. The first is feature extraction: contours and the corresponding dose volumes are extracted from the plan database, and an algorithm based on the Visualization Toolkit (VTK) converts the contours and isodose distribution datasets into matrices. The second is relationship learning: from the historical dose and contour features, a predictive model based on a residual network (ResNet) is trained to correlate dose features with anatomical features. DVHs of organs at risk (OARs) are then generated from the predicted 3D dose distributions.
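As a rough illustration of these two steps, the Python sketch below pictures a convolutional residual network that maps stacked binary contour masks to a voxel-wise dose matrix, followed by DVH extraction from the predicted dose. This is a minimal, hypothetical sketch, not the study's implementation: the network depth, channel counts, structure count, and voxel grid sizes are placeholder assumptions.

    # Minimal sketch (PyTorch): contour masks -> residual CNN -> 3D dose -> DVH.
    # Shapes, channel counts, and block depth are illustrative assumptions,
    # not the parameters used in the study.
    import numpy as np
    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        """3D conv block with an identity skip connection (ResNet-style)."""
        def __init__(self, channels):
            super().__init__()
            self.conv1 = nn.Conv3d(channels, channels, kernel_size=3, padding=1)
            self.conv2 = nn.Conv3d(channels, channels, kernel_size=3, padding=1)
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x):
            out = self.conv2(self.relu(self.conv1(x)))
            return self.relu(out + x)  # residual connection

    class ContourToDoseNet(nn.Module):
        """Maps stacked binary structure masks to a voxel-wise dose prediction."""
        def __init__(self, n_structures=8, width=32, n_blocks=4):
            super().__init__()
            self.stem = nn.Conv3d(n_structures, width, kernel_size=3, padding=1)
            self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(n_blocks)])
            self.head = nn.Conv3d(width, 1, kernel_size=1)  # one dose value per voxel

        def forward(self, masks):
            return self.head(self.blocks(self.stem(masks)))

    def dvh(dose, mask, bins=100, max_dose=70.0):
        """Cumulative DVH: fraction of the structure receiving >= each dose level."""
        levels = np.linspace(0.0, max_dose, bins)
        d = dose[mask > 0]
        volume_fraction = np.array([(d >= lv).mean() for lv in levels])
        return levels, volume_fraction

    # Toy usage on a small placeholder grid (real plans use CT-resolution matrices).
    model = ContourToDoseNet()
    masks = torch.zeros(1, 8, 32, 32, 32)   # batch of stacked structure masks
    masks[0, 0, 10:20, 10:20, 10:20] = 1.0  # e.g. a cubic "PTV"
    pred = model(masks).detach().numpy()[0, 0]
    levels, vf = dvh(pred, masks.numpy()[0, 0])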

Results: Fifty head-and-neck patients with IMRT plans were used to train the models. Preliminary results using a 50-layer ResNet show that, over more than 50% of the dose volume, the difference between the predicted and actual dose is within 10%. These results demonstrate the feasibility of dose distribution prediction for complex head-and-neck cancers.
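For concreteness, the reported metric (the fraction of the dose volume where predicted and actual dose agree within 10%) could be computed along the following lines; normalizing the voxel-wise difference by the prescription dose is an assumption, since the abstract does not state the exact definition.

    import numpy as np

    def fraction_within_tolerance(predicted, actual, prescription_dose=70.0, tol=0.10):
        # Fraction of voxels whose absolute dose difference is within `tol`
        # (10%) of the prescription dose. The normalization choice is an
        # assumption; the abstract does not define it precisely.
        diff = np.abs(predicted - actual) / prescription_dose
        return float((diff <= tol).mean())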

Conclusion: A deep-learning-based dose prediction model for IMRT treatment is established. The algorithm predicts the dose distribution from contours alone, using a deep-learning method to derive a contour-to-dose relation map. The resulting dose map can serve as prior knowledge for treatment planning QA or automatic planning.

