A 3D Camera Test-Bed for Real-Time Monitoring and Quantitative Tracking of Radiotherapy Treatment
Y Min*, A Santhanam, D Low, P Kupelian, UCLA, Los Angeles, CA
SU-E-J-156, Sunday 3:00PM - 6:00PM, Room: Exhibit Hall
Purpose: To investigate methods for 3D camera-based monitoring and quantitative tracking in the radiation therapy treatment room.
Methods and Materials: A laboratory test-bed for installing 3D cameras and testing 3D tracking algorithms was built to develop techniques for 3D treatment room monitoring. Kinect cameras were used as the 3D camera platform. The cameras were anchored to a rigid framework attached to the laboratory walls to avoid vibration and camera motion drift. A spatio-temporal clustering algorithm was employed to remove camera flutter noise: features whose depth estimate varied by more than 3 mm within one-third of a second were removed. A client-server framework enabled remotely located experts to visualize the 3D treatment space in real time. On the server side, a scalable multi-GPU system rendered the 3D treatment space in stereo and in real time. For tracking both the patient body surface and the treatment room equipment, we investigated two algorithms: one based on Point Feature Histograms and one based on a multi-resolution 3D Hough transform. For real-time tracking, we also employed 2D multi-resolution optical flow for effective temporal tracking of the radiation equipment and the patient anatomy.
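The abstract does not give implementation details for the flutter filter. As a minimal sketch, assuming a 30 fps Kinect depth stream (so one-third of a second spans roughly 10 frames) and the stated 3 mm threshold, the rejection rule could look like the following; the function name and array layout are illustrative, not from the original work:

```python
import numpy as np

# Assumed parameters: 30 fps depth stream, 3 mm flutter threshold (per abstract).
FPS = 30
WINDOW = FPS // 3          # frames spanning one-third of a second
THRESHOLD_MM = 3.0

def stable_feature_mask(depth_mm):
    """depth_mm: (n_features, n_frames) array of per-feature depth estimates.

    A feature is kept only if, within every sliding one-third-second
    window, its depth varies by no more than THRESHOLD_MM."""
    n_features, n_frames = depth_mm.shape
    keep = np.ones(n_features, dtype=bool)
    for start in range(n_frames - WINDOW + 1):
        window = depth_mm[:, start:start + WINDOW]
        spread = window.max(axis=1) - window.min(axis=1)
        keep &= spread <= THRESHOLD_MM
    return keep

# Example: a steady feature and one with a 5 mm flutter spike.
depth = np.full((2, 30), 1000.0)
depth[1, 15] += 5.0
mask = stable_feature_mask(depth)  # feature 0 kept, feature 1 rejected
```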
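The Point Feature Histogram descriptor used for surface tracking encodes, for each point, a histogram of geometric relations to its neighbors. A heavily simplified numpy sketch of the idea (real PFH bins three pairwise angles in a Darboux frame; here only the angle between neighboring normals is binned, and all names are hypothetical) is:

```python
import numpy as np

def simplified_pfh(points, normals, radius=0.05, bins=8):
    """points, normals: (n, 3) arrays. Returns an (n, bins) descriptor:
    for each point, a normalized histogram of the angles between its
    normal and the normals of neighbors within `radius` (meters)."""
    n = len(points)
    desc = np.zeros((n, bins))
    for i in range(n):
        # Brute-force radius search; a k-d tree would be used in practice.
        dist = np.linalg.norm(points - points[i], axis=1)
        nbrs = np.where((dist < radius) & (dist > 0))[0]
        if nbrs.size == 0:
            continue
        cosang = np.clip(normals[nbrs] @ normals[i], -1.0, 1.0)
        hist, _ = np.histogram(np.arccos(cosang), bins=bins, range=(0.0, np.pi))
        desc[i] = hist / nbrs.size
    return desc

# Example: four coplanar points with identical normals -> all angle mass
# falls into the first (near-zero-angle) bin.
pts = np.array([[0, 0, 0], [0.01, 0, 0], [0, 0.01, 0], [0.01, 0.01, 0]], float)
nrm = np.tile([0.0, 0.0, 1.0], (4, 1))
desc = simplified_pfh(pts, nrm)
```

Descriptors like this are matched between frames to re-locate the same surface patch, which is what makes them useful for tracking both anatomy and equipment.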
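The 2D multi-resolution optical flow is built from a per-level dense flow solve applied coarse-to-fine over an image pyramid. A single-level Lucas-Kanade sketch in numpy (the pyramid loop is omitted; window size and the conditioning cutoff are assumed, not from the original work) is:

```python
import numpy as np

def lucas_kanade(im0, im1, win=7):
    """Dense single-level Lucas-Kanade flow: one building block of a
    multi-resolution scheme (run per pyramid level, coarse to fine).
    Returns an (H, W, 2) array of (u, v) displacements from im0 to im1."""
    Ix = np.gradient((im0 + im1) / 2.0, axis=1)   # spatial derivatives
    Iy = np.gradient((im0 + im1) / 2.0, axis=0)
    It = im1 - im0                                # temporal derivative
    half = win // 2
    flow = np.zeros(im0.shape + (2,))
    for y in range(half, im0.shape[0] - half):
        for x in range(half, im0.shape[1] - half):
            sl = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
            A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
            b = -It[sl].ravel()
            ATA = A.T @ A
            # Skip poorly conditioned windows (aperture problem).
            if np.linalg.cond(ATA) < 1e6:
                flow[y, x] = np.linalg.solve(ATA, A.T @ b)
    return flow

# Example: a smoothed random texture translated right by one pixel.
rng = np.random.default_rng(0)
im0 = rng.random((40, 40))
for _ in range(10):   # cheap box smoothing via rolls (periodic image)
    im0 = (im0 + np.roll(im0, 1, 0) + np.roll(im0, -1, 0)
               + np.roll(im0, 1, 1) + np.roll(im0, -1, 1)) / 5.0
im1 = np.roll(im0, 1, axis=1)
flow = lucas_kanade(im0, im1)
u = np.median(flow[10:30, 10:30, 0])   # expected near +1
v = np.median(flow[10:30, 10:30, 1])   # expected near 0
```

In practice a GPU implementation (e.g. via OpenCV's pyramidal flow routines) would replace the Python loops; the abstract's run-time analysis points at exactly this need.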
Results: For remotely located experts, treatment space visualization was performed at 40 fps with a resolution of 640 x 480 pixels. Sub-millimeter tracking accuracy for both the patient body surface and the treatment room equipment was obtained when the tracked features were within 120 cm of any of the 3D cameras; accuracy degraded with increasing distance.
Conclusion: Remote real-time stereoscopic patient setup visualization is feasible, enabling expansion of high-quality radiation therapy into challenging environments. For both tested tracking approaches, run-time analysis showed that GPU hardware is needed to track features in real time.