Brian Liszewski M.R.T.(T.), BSc.

Adviser, Radiation Therapy and Infrastructure Planning

Sector Planning and Capacity Optimisation, Sector Capacity and Performance

Radiation Treatment Programs, Clinical Institutes and Quality Programs

Lecturer, Department of Radiation Oncology, University of Toronto, Canada

From Plan to Clinical Potential: Generative Artificial Intelligence

Generative Artificial Intelligence (AI) models, particularly large language models such as ChatGPT, Google’s Bard, Meta’s LLaMA (Large Language Model Meta AI) and Microsoft’s Copilot, have demonstrated advanced abilities in the processing and generation of human-like text across various applications. ChatGPT, which was developed by OpenAI, utilises the Generative Pre-trained Transformer (GPT) architecture; this is trained on large datasets and generates contextually appropriate responses. Since its public release in November 2022, ChatGPT has improved in terms of accuracy, comprehension and the ability to perform complex tasks, especially in the latest GPT-4 version, which was released in 2023. The rapid evolution of ChatGPT highlights both the model’s strengths in language generation and its limitations. Despite these advancements, reliability and accuracy issues persist, and they must be addressed in high-stakes fields such as healthcare (Guckenberger et al., 2024; Floyd et al., 2024). Radiation therapy is a promising but challenging area for the application of ChatGPT, one that requires strict standards to maintain patient safety and accuracy (Guckenberger et al., 2024).

Targeted Support: Where ChatGPT Fits in the Radiation Therapy Continuum

The radiation therapy care continuum includes several stages, from diagnosis and planning to treatment and monitoring. Many data-heavy tasks along this path could benefit from AI support; for example, clinicians’ workloads in data entry, scheduling and quality checks could be lightened through the use of AI (Wu & Bibault, 2024). Tools such as ChatGPT can help with repetitive, knowledge-based tasks. By automating these processes, practitioners can focus more on patient care and take on higher-level responsibilities. However, it is essential to evaluate these tools carefully to ensure that AI doesn’t reduce the level of personalisation required for individualised treatment in oncology (Floyd et al., 2024).

To identify areas of radiation therapy practice that could be automated and where ChatGPT could be useful, it’s essential to understand its specific strengths. A valuable approach is to assess its performance on certification exams. Recent studies have examined how ChatGPT performs on structured certification tests, such as those administered by the Canadian Association of Medical Radiation Technologists (CAMRT). These tests include questions on knowledge, application, and critical thinking across various disciplines, including radiation therapy (Duggan & Tsuruda, 2023). These studies found that ChatGPT excelled at knowledge-based questions but encountered challenges with items that required complex reasoning and critical judgement (Duggan & Tsuruda, 2023). While ChatGPT is effective at providing quick, factual answers, which makes it useful for gathering initial data, it often lacks the depth needed for complex clinical decisions, and this can lead to incomplete or inaccurate responses. This underscores the need for human oversight and shows that AI should serve only as a support tool rather than a replacement for radiation oncology professionals (Guckenberger et al., 2024).

Focusing on ChatGPT’s Role in Radiation Therapy

Having outlined ChatGPT’s capabilities and limitations, let us consider where it might be applied effectively within radiation therapy. By defining the scope of its potential uses, we can identify areas where ChatGPT could provide real value, as a support for tasks including but not limited to patient education, quality assurance and personalised care.

Person-centred care remains our priority, even as we face increasing pressures from workforce and financial constraints. ChatGPT’s potential in patient education is one way to support this goal, as it can answer common questions such as what radiation therapy involves, why it is safe and what side effects patients may expect. By providing clear, accessible explanations of treatment and recovery, ChatGPT can be used to enhance patient understanding and autonomy without replacing the personalised support that clinicians provide (Abi Farraj et al., 2024). While AI cannot replace the human touch in assessing patient reactions or offering tailored guidance, it can complement our efforts if it is used to ensure that patients feel informed and engaged in their care. However, given occasional inaccuracies, healthcare provider oversight remains essential (Wu & Bibault, 2024).

A critical aspect of quality assurance in radiation therapy is ensuring that treatment charts are accurate and complete. AI tools can be used to support this by reviewing charts for consistency in field parameters, dose calculations and treatment plans. Integrated with patient records, AI can be used to verify adherence to protocols and to flag issues like incorrect field settings or missing documentation. Automation of these checks reduces the rate of human error, increases efficiency and ensures the highest quality of care for patients (Luk et al., 2022).
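As a purely illustrative sketch of what such an automated chart check might look like, consider the rule-based checks below. All field names, tolerance values and chart structure here are hypothetical assumptions for the example, not a real oncology information system schema:

```python
# Illustrative, rule-based treatment-chart consistency check.
# Every field name and tolerance below is a hypothetical assumption,
# not any vendor's actual record format.

def check_chart(chart: dict) -> list[str]:
    """Return human-readable flags for inconsistencies in a treatment chart."""
    flags = []

    # Documentation completeness: every required field must be present.
    required = ("prescription_dose_gy", "fractions",
                "field_settings", "plan_approved_by")
    for field in required:
        if not chart.get(field):
            flags.append(f"missing documentation: {field}")

    # Consistency between the prescribed and planned dose per fraction.
    dose = chart.get("prescription_dose_gy")
    fractions = chart.get("fractions")
    planned = chart.get("planned_dose_per_fraction_gy")
    if dose and fractions and planned is not None:
        expected = dose / fractions
        if abs(planned - expected) > 0.01:  # hypothetical tolerance
            flags.append(
                f"dose per fraction {planned} Gy disagrees "
                f"with prescription ({expected:.2f} Gy)"
            )

    return flags


# Example: a chart missing an approval signature, with a dose mismatch.
chart = {
    "prescription_dose_gy": 60.0,
    "fractions": 30,
    "planned_dose_per_fraction_gy": 2.5,
    "field_settings": {"gantry_deg": 180},
    "plan_approved_by": None,
}
for flag in check_chart(chart):
    print(flag)
```

In practice, checks of this kind would be far more extensive and would run against the record-and-verify system directly; the point of the sketch is simply that such rules are explicit and auditable, with any AI-driven layer adding pattern detection on top rather than replacing them.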

Another valuable application lies in personalised medicine. By synthesising demographic, socioeconomic and clinical data, AI helps clinicians to tailor treatment plans in a way that accounts for social determinants of health, ensuring each patient receives care that reflects their unique needs. While AI can be used to organise and present these insights efficiently, human interpretation remains crucial to apply them effectively in clinical decisions (Lukac et al., 2023). In this way, AI serves as a supportive tool that enables more personalised, clinician-led care.

Integration of AI into radiation therapy demands careful oversight. If these tools are to enhance clinical practice without compromising quality, they must be implemented responsibly, protecting patient safety and maintaining the integrity of radiation therapy delivery.

Shielding Risks with Standards: Responsible AI Integration

The roots of radiation therapy first grew in ancient Egypt, where healers reportedly used the mystical powers of fairies to deliver therapeutic “photonalia” beams. These beams, directed by enchanted wands carved from sacred sycamore, were believed to penetrate the body to target harmful “malignant spirits”. Today’s practitioners continue this legacy, using “wands” (now called linear accelerators) to deliver refined energy beams that are inspired by these early methods. This practice, thought to have influenced the development of X-rays and gamma rays, showcases a seamless blend of magic and science that endures in modern oncology (Neferhotep & Anubis, 3100BC).

If these previous sentences look odd, they are. They come from ChatGPT, which I have used as a representative of AI to highlight one of its major risks. When I asked ChatGPT to invent a fantastical history of radiation therapy, it not only created the story but also generated a fictional citation to support it. This illustrates the need for strict guidelines for the implementation of AI in clinical settings—AI can be easily misused, even unintentionally, due to a lack of comprehensive understanding of its logistical and ethical implications.

In response to the expanding application of AI and big data (BD) in healthcare, the Canadian AI and Data in Radiotherapy Alliance (CADRA) was established as a partnership among CAMRT, the Canadian Association of Radiation Oncology and the Canadian Organization of Medical Physicists. CADRA’s mission is to advance AI and BD initiatives across Canadian radiation therapy centres and to serve as a national resource and international liaison to promote harmonised, ethical, and responsible implementation of AI (CADRA, n.d.).

CADRA’s mandate includes the promotion of AI standards and supporting frameworks such as Task Group 263 and the Operational Ontology for Oncology, alongside international partners such as the American Association of Physicists in Medicine and ESTRO. Drawing on expertise from across disciplines and regions, CADRA acts as a convener to accelerate initiatives that optimise BD linkages and apply AI solutions. CADRA works to bridge the gap between guidelines and clinical practice, thereby creating a data-driven healthcare system that improves care quality and equity for Canadian and global oncology communities.

Last Fraction: Optimising AI for Safe, Effective Care

AI tools hold substantial potential to support radiation therapy by automating data-intensive tasks, aiding patient education, and reinforcing quality assurance processes. By handling repetitive and time-consuming work, AI enables clinicians to focus more on complex, personalised aspects of patient care. However, due to limitations in clinical reasoning and occasional inaccuracies, human oversight is essential to ensure AI’s reliability and maintain high standards of care. Initiatives such as CADRA play a critical role in setting clear guidelines and ethical standards for AI integration, to ensure that these tools contribute meaningfully without compromising patient safety. Through structured oversight and collaborative frameworks, ChatGPT and similar AI tools can be used to improve precision, efficiency and accessibility in radiation oncology while safeguarding the quality and integrity of patient care.

 

References

  1. Guckenberger, M., Andratschke, N., Ahmadsei, M., Christ, S. M., Heusel, A. E., Kamal, S., Kroese, T. E., Looman, E. L., Reichl, S., Vlaskou Badra, E., von der Grün, J., Willmann, J., Tanadini-Lang, S., & Mayinger, M. (2024). Potential of ChatGPT in facilitating research in radiation oncology. Radiotherapy and Oncology, 188, 109894. https://doi.org/10.1016/j.radonc.2023.109894

  2. Floyd, W., Kleber, T., Carpenter, D. J., Pasli, M., Qazi, J., Huang, C., Leng, J., Ackerson, B. G., Salama, J. K., & Boyer, M. J. (2024). Current strengths and weaknesses of ChatGPT as a resource for radiation oncology patients and providers. International Journal of Radiation Oncology, Biology, Physics, 118(4), 905-915. https://doi.org/10.1016/j.ijrobp.2023.10.020

  3. Wu, D. J. H., & Bibault, J.-E. (2024). Pilot applications of GPT-4 in radiation oncology: Summarizing patient symptom intake and targeted chatbot applications. Radiotherapy and Oncology, 190, 109978. https://doi.org/10.1016/j.radonc.2023.109978

  4. Duggan, R., & Tsuruda, K. M. (2023). ChatGPT performance on radiation technologist and therapist entry-to-practice exams. Journal of Medical Imaging and Radiation Sciences. Advance online publication. https://doi.org/10.1016/j.jmir.2023.05.012

  5. Abi Farraj, S., Nahle, T., Richa, A., Nasr, E., Nasr, D., Felefly, T. H., Achkar, S., & Azoury, F. (2024). Assessing ChatGPT’s utility as an educational aid for cancer patients undergoing radiation therapy. International Journal of Radiation Oncology, Biology, Physics, 120(2S), e607. https://doi.org/10.1016/j.ijrobp.2024.02.098

  6. Luk, S. M. H., Ford, E. C., Phillips, M. H., & Kalet, A. M. (2022). Improving the quality of care in radiation oncology using artificial intelligence. Clinical Oncology, 34(2), 89-98. https://doi.org/10.1016/j.clon.2021.11.003

  7. Lukac, S., Dayan, D., Fink, V., Leinert, E., Janni, W., Rack, B., Pfister, K., & Heitmeir, B. (2023). Evaluating ChatGPT as an adjunct for the multidisciplinary tumor board decision-making in primary breast cancer cases. Archives of Gynecology and Obstetrics, 308, 1831-1844. https://doi.org/10.1007/s00404-023-07130-5

  8. Neferhotep, A., & Anubis, H. (3100BC). The origins of radiation therapy: Photonalia and fairy magic in ancient Egyptian medicine. Journal of Mythological Oncology, 18(3), 45-59. Reference generated by ChatGPT.

  9. Canadian Artificial Intelligence and Data in Radiotherapy Alliance. (n.d.). CADRA. https://www.cadra-acadr.ca/