Main Session
Sep 28
PQA 01 - Radiation and Cancer Physics, Sarcoma and Cutaneous Tumors

2093 - AI-Empowered MR-Guided Radiotherapy on a Conventional LINAC

02:30pm - 04:00pm PT
Hall F
Screen: 15
POSTER

Presenter(s)

Scott Hollingsworth, BS - Washington University in St. Louis School of Medicine, St. Louis, MO

S. Hollingsworth1, Y. Yoon1, X. Li2, and T. Kim3; 1WashU Medicine, Department of Radiation Oncology, St. Louis, MO, 2Siemens Healthineers, St. Louis, MO, 3Wash U School of Medicine, Department of Radiation Oncology, St. Louis, MO

Purpose/Objective(s): Conventional radiotherapy has limited precision at high doses and in complex anatomical sites because patient anatomy varies dynamically during treatment. While MR-guided radiotherapy (MRgRT) improves accuracy with adaptive planning and real-time imaging, its complexity limits its use in community hospitals. We present a practical approach that enables MRgRT on conventional LINACs in community clinics.

Materials/Methods: We developed a method that predicts dynamic 3D MRIs from X-ray projections with a patient-specific deep learning network, given prior 3D cine MRI and simulation CT scans. The network extracts 2D features from the input projections to predict new volumes in real time for tumor tracking and organ-at-risk management. High-resolution 3D cine MRIs are prepared with a separate super-resolution network. We validated the approach using 3D cine MRI data from 0.55T, 1.5T, and 3T MRI systems, as well as an IRB-approved clinical study of a patient undergoing surface-guided radiotherapy for a paracaval lymph node treated to 50 Gy in 5 fractions. The simulation CT was deformably registered to each image in the patient's dynamic MRI set, and digitally reconstructed radiographs (DRRs) were then generated for network training. For volunteer scans without CT data, DRRs were derived from deformed digital phantom CTs.
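For illustration, a minimal sketch of the projection-to-volume prediction step is given below. The abstract does not specify the network design, so the class name ProjectionToVolumeNet, the layer choices, and the volume dimensions are placeholder assumptions rather than the authors' implementation.

import torch
import torch.nn as nn

class ProjectionToVolumeNet(nn.Module):
    """Toy patient-specific model: encode a 2D X-ray projection (DRR) into
    features, then decode them into a low-resolution 3D MRI-like volume.
    Architecture and sizes are illustrative assumptions only."""

    def __init__(self, vol_shape=(32, 64, 64)):
        super().__init__()
        self.vol_shape = vol_shape
        # 2D feature extractor applied to the input projection
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        # Map the pooled 2D features to a flattened 3D volume
        self.decoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 256), nn.ReLU(),
            nn.Linear(256, vol_shape[0] * vol_shape[1] * vol_shape[2]),
        )

    def forward(self, drr):                       # drr: (B, 1, H, W)
        feats = self.encoder(drr)
        vol = self.decoder(feats)
        return vol.view(-1, 1, *self.vol_shape)   # (B, 1, D, H, W)

if __name__ == "__main__":
    model = ProjectionToVolumeNet()
    drr = torch.rand(1, 1, 256, 256)              # one simulated DRR projection
    with torch.no_grad():
        pred_volume = model(drr)
    print(pred_volume.shape)                      # torch.Size([1, 1, 32, 64, 64])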

Results: The network successfully predicted dynamic MRIs from DRR-derived 2D features across multiple MRI field strengths. Visual inspection confirmed that the predicted MRIs matched the ground truth in clarity and in anatomical boundaries (diaphragm, liver, kidneys, and patient surface). Where anatomical boundary variations were observed, we measured the distance between ground truth and prediction: variations in the 0.55T and 1.5T data were approximately 0.6 cm or less, while the 3T data showed larger variations, with a maximum difference of 1.5 cm. For the clinical study data, the network generated 100 dynamic MRI volumes in under 0.8 seconds with high anatomical accuracy. The paracaval lymph node appeared blurrier than in the ground truth, but the directly adjacent liver boundary remained clear and correctly positioned; memory constraints, which required lowering the ground-truth resolution, may have contributed to the blurring. Nonetheless, variations were minimal and difficult to measure, demonstrating patient-specific accuracy.
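One plausible way to quantify such boundary variations is a surface-distance measurement between ground-truth and predicted organ masks. The abstract does not state which metric was used, so the sketch below, including the function name max_boundary_distance and the voxel spacing, is an assumption for illustration only.

import numpy as np
from scipy import ndimage

def max_boundary_distance(gt_mask, pred_mask, spacing):
    """Maximum distance (in cm) from the predicted boundary to the nearest
    ground-truth boundary voxel, via a Euclidean distance transform."""
    gt_boundary = gt_mask ^ ndimage.binary_erosion(gt_mask)
    pred_boundary = pred_mask ^ ndimage.binary_erosion(pred_mask)
    # Distance (in mm) from every voxel to the nearest ground-truth boundary voxel
    dist_to_gt = ndimage.distance_transform_edt(~gt_boundary, sampling=spacing)
    return dist_to_gt[pred_boundary].max() / 10.0  # mm -> cm

if __name__ == "__main__":
    gt = np.zeros((32, 64, 64), dtype=bool)
    gt[10:20, 20:40, 20:40] = True                 # synthetic organ mask
    pred = np.roll(gt, shift=2, axis=1)            # simulate a 2-voxel (6 mm) shift
    print(max_boundary_distance(gt, pred, spacing=(3.0, 3.0, 3.0)))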

Conclusion: Our patient-specific deep learning network offers an accessible way for community clinics to deliver MRgRT on conventional LINACs. The speed of the network's predictions makes real-time imaging feasible. Deforming digital phantom data to match patient data could eliminate the need for CT scans; alternatively, retaining the CT simulation while integrating MRgRT into conventional LINACs would enhance accessibility and adaptability, ensuring precise and personalized radiotherapy care.