AI Tool Uses Radiology Reports for Cancer Outcomes

Researchers from the Dana-Farber Cancer Institute have developed an artificial intelligence (AI) tool that is as effective as human reviewers at gathering information on cancer progression from unstructured radiology reports. The study, published in JAMA Oncology, showed that the AI could not only identify the presence of cancer and patient outcomes but could also do so far more quickly.

Electronic health records (EHRs) can hold a great deal of data about a patient. However, information about cancer progression is usually recorded only in the free text of the medical record, unless the patient is enrolled in a clinical trial. Because this information is unstructured, it has not been available for computer analysis and could previously only be reviewed manually.

Kenneth Kehl, M.D., M.P.H., a medical oncologist at Dana-Farber and the study's corresponding author, said the goal was to determine whether the AI technology could extract important information about a patient's cancer status from radiology reports. Centres such as Dana-Farber generate a great deal of information about patients and their progress, for example through the Profile initiative at Dana-Farber/Brigham and Women’s Cancer Center, which lets researchers analyse tumour samples and build profiles of genetic variants that could affect how patients respond to certain treatments. However, as Kehl explains, the difficulty in implementing precision medicine often lies not in gathering the data but in applying that information effectively to determine patient response.

In this latest study, more than 14,000 imaging reports for 1,112 patients were manually reviewed using the PRISSMM framework. PRISSMM is a phenomic data standard that translates unstructured text from the medical record in EHRs into readily analysable data. The method takes into account symptoms, pathology, molecular markers and the medical oncologists’ assessment to predict patient outcomes.
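
The article does not reproduce the PRISSMM specification itself, but the sketch below gives a hypothetical sense, in Python, of the kind of structured record a curated imaging report might yield. The class and field names are invented for illustration and are not the actual standard.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structured annotation derived from one free-text imaging report.
# Field names are illustrative only, not the PRISSMM specification.
@dataclass
class ImagingReportAnnotation:
    patient_id: str
    report_date: str               # ISO date of the imaging report
    cancer_present: bool           # any evidence of cancer on this scan
    progression: Optional[bool]    # worsening vs. prior imaging, if assessable
    response: Optional[bool]       # improvement vs. prior imaging, if assessable

# Example: one manually curated label of the sort a model could be trained to reproduce.
example = ImagingReportAnnotation(
    patient_id="P-0001",
    report_date="2018-03-14",
    cancer_present=True,
    progression=False,
    response=True,
)
```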

Initially, human reviewers annotated outcomes from the imaging text reports, and these annotations were used to ‘teach’ a deep learning model to predict the presence and extent of cancer over time. Outcomes were classified by progression-free survival, disease-free survival and time to improvement/response. The results showed that the AI could replicate the outcomes recorded by human reviewers. The researchers then applied the algorithm to 15,000 reports from 1,294 patients that had not been manually reviewed, and it again predicted outcomes with accuracy similar to that of human reviewers.
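
As a rough illustration of this supervised workflow, the minimal sketch below trains a simple text classifier on human-labelled report text and then applies it to unreviewed reports. A TF-IDF plus logistic regression pipeline stands in for the study's actual deep learning model, and the report texts, labels and variable names are illustrative only.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical human-reviewed radiology report texts and their curated labels
# (1 = reviewer recorded cancer progression, 0 = no progression).
report_texts = [
    "Interval increase in size of the hepatic lesions, consistent with progression.",
    "No evidence of recurrent or metastatic disease.",
]
labels = [1, 0]

# Train a simple text classifier on the manually labelled reports.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(report_texts, labels)

# Apply the trained model to reports that were never manually reviewed,
# inferring outcomes at scale as described in the study.
unreviewed = ["Stable appearance of the pulmonary nodules compared with prior CT."]
print(model.predict(unreviewed))
```

In practice the study's approach worked at a much larger scale (thousands of reports per cohort) and with a deep learning model rather than this simplified pipeline, but the supervised structure, that is, human labels first, then automated prediction on unlabelled reports, is the same.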
