

AI Tool Matches Physicians in Mapping Moving Lung Tumors



Researchers at Northwestern University have developed a three-dimensional deep learning model that can match the performance of radiation oncologists in identifying and outlining lung tumors on computed tomography (CT) scans. The system, called iSeg, was trained and validated on data from nine clinics across two US health systems.


In a study published in npj Precision Oncology, iSeg demonstrated the ability to automate tumor segmentation with high consistency across hospitals and scan types. It also identified high-risk tumor regions that had been missed in some physician-drawn outlines.

Addressing variation in tumor mapping

Radiation therapy requires precise targeting to destroy cancer cells while sparing surrounding healthy tissue. A key step in this process is tumor segmentation, the delineation of a tumor's edges on medical images, which remains manual, time-consuming, and prone to variation between clinicians.
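To make that variation concrete, agreement between two tumor outlines is commonly quantified with an overlap metric such as the Dice coefficient. The snippet below is a minimal illustration in Python with NumPy; the toy masks and the `dice` helper are invented for this example and are not taken from the study.

```python
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two binary masks: 1.0 = identical, 0.0 = no overlap."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    total = a.sum() + b.sum()
    return float(2.0 * np.logical_and(a, b).sum() / total) if total else 1.0

# Toy example: two clinicians outline the same tumor slightly differently.
outline_1 = np.zeros((64, 64), dtype=bool)
outline_1[20:40, 20:40] = True
outline_2 = np.zeros((64, 64), dtype=bool)
outline_2[23:43, 23:43] = True
print(f"Inter-observer Dice: {dice(outline_1, outline_2):.2f}")
```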


To address these challenges, the researchers trained iSeg using hundreds of CT scans and corresponding annotations created by physicians. The tool was then tested on a separate dataset, with its results compared against new physician outlines.
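The general recipe described here (train a 3D convolutional network on CT volumes paired with physician-drawn masks, then evaluate it on held-out scans) can be sketched as follows. This is a deliberately tiny illustration in PyTorch; the architecture, the `soft_dice_loss` helper, and the synthetic data are assumptions for demonstration only and do not reflect iSeg's actual design.

```python
import torch
import torch.nn as nn

class TinySegNet3D(nn.Module):
    """A deliberately small 3D network that predicts a per-voxel tumor logit."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv3d(16, 1, kernel_size=1)

    def forward(self, x):
        return self.head(self.features(x))

def soft_dice_loss(logits, target, eps=1e-6):
    """1 minus the (soft) Dice overlap between predictions and the physician mask."""
    probs = torch.sigmoid(logits)
    inter = (probs * target).sum()
    return 1 - (2 * inter + eps) / (probs.sum() + target.sum() + eps)

# Synthetic stand-ins for one CT volume and one physician-drawn tumor mask.
ct_volume = torch.randn(1, 1, 32, 64, 64)                  # (batch, channel, D, H, W)
expert_mask = (torch.rand(1, 1, 32, 64, 64) > 0.95).float()

model = TinySegNet3D()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(3):                                       # a few illustrative steps
    optimizer.zero_grad()
    loss = soft_dice_loss(model(ct_volume), expert_mask)
    loss.backward()
    optimizer.step()

# "Testing": threshold the predicted probabilities on a held-out volume.
with torch.no_grad():
    predicted_mask = torch.sigmoid(model(torch.randn(1, 1, 32, 64, 64))) > 0.5
```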


The team reported that iSeg reliably replicated expert-level contours and flagged additional tumor areas not captured by some clinicians. These missed regions were associated with poorer treatment outcomes in retrospective analyses, suggesting that the AI tool could help reduce the risk of tumor regions being missed during treatment planning.

Adapting to tumor movement during breathing

Unlike earlier tools that analyze static images, iSeg was designed to account for tumor motion caused by breathing — a common source of error in thoracic radiation planning. This dynamic segmentation could help improve the accuracy of treatment for lung cancer patients.
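One common way thoracic planning accounts for breathing motion is to outline the tumor on each phase of a respiration-correlated (4D) CT and then combine the per-phase masks into a motion-encompassing envelope, similar in spirit to an internal target volume. The sketch below illustrates only that combination step; the `motion_envelope` and `sphere` helpers and the toy geometry are hypothetical and are not drawn from the iSeg paper.

```python
import numpy as np

def sphere(center, radius=5, shape=(32, 64, 64)):
    """Toy spherical 'tumor' mask centred at `center` inside a CT-sized grid."""
    zz, yy, xx = np.ogrid[:shape[0], :shape[1], :shape[2]]
    return ((zz - center[0]) ** 2 + (yy - center[1]) ** 2
            + (xx - center[2]) ** 2) <= radius ** 2

def motion_envelope(phase_masks):
    """Union of per-breathing-phase masks: the volume the tumor sweeps through."""
    envelope = np.zeros_like(phase_masks[0], dtype=bool)
    for mask in phase_masks:
        envelope |= mask.astype(bool)
    return envelope

# Toy 4D CT: the same tumor shifted a few voxels at each breathing phase.
phases = [sphere((16, 32, 28 + shift)) for shift in (0, 2, 4, 6)]
print("Voxels in one phase:      ", int(phases[0].sum()))
print("Voxels in motion envelope:", int(motion_envelope(phases).sum()))
```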


Approximately half of all patients with cancer in the United States receive radiation therapy during the course of their illness. The ability to automate and standardize tumor mapping could help reduce delays and support consistent care, particularly in settings with limited access to subspecialty expertise.

Future applications and clinical use

The research group is currently evaluating iSeg in real-time clinical settings. They are also expanding its capabilities to segment tumors in other organs, such as the liver, brain, and prostate. Plans are underway to adapt the tool for additional imaging modalities, including magnetic resonance imaging (MRI) and positron emission tomography (PET).


According to the study team, iSeg could be integrated into clinical workflows within the next few years, potentially serving as a decision-support tool for radiation oncologists.

 

Reference: Sarkar S, Teo PT, Abazeed ME. Deep learning for automated, motion-resolved tumor segmentation in radiotherapy. npj Precis Oncol. 2025. doi:10.1038/s41698-025-00970-1


This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source. Our press release publishing policy can be accessed here.

This content includes text that has been generated with the assistance of AI. Technology Networks' AI policy can be found here.