
Can Cancer Patients Count on ChatGPT for Good Advice?

Image: A computer screen showing the ChatGPT homepage. Credit: Emiliano Vittoriosi/Unsplash


A study in JNCI Cancer Spectrum examined chatbots and artificial intelligence (AI) as they become popular resources for cancer information, finding that these tools give accurate information when asked about common cancer myths and misconceptions. In the first study of its kind, Skyler Johnson, MD, physician-scientist at Huntsman Cancer Institute and assistant professor in the Department of Radiation Oncology at the University of Utah (the U), evaluated the reliability and accuracy of ChatGPT’s cancer information.


Using the National Cancer Institute’s (NCI) list of common myths and misconceptions about cancer, Johnson and his team found that 97% of ChatGPT’s answers were correct. However, this finding comes with important caveats, including the team’s concern that some of ChatGPT’s answers could be misinterpreted. “This could lead to some bad decisions by cancer patients,” says Johnson. The team suggested caution when advising patients about whether they should use chatbots for information about cancer.


In the study, reviewers were blinded, meaning they didn’t know whether the answers came from the chatbot or the NCI. Though the answers were accurate, reviewers found ChatGPT’s language was indirect, vague and, in some cases, unclear.

“I recognize and understand how difficult it can feel for cancer patients and caregivers to access accurate information,” says Johnson. “These sources need to be studied so that we can help cancer patients navigate the murky waters that exist in the online information environment as they try to seek answers about their diagnoses.”


Incorrect information can harm cancer patients. A previous study by Johnson and his team, published in the Journal of the National Cancer Institute, found that misinformation was common on social media and had the potential to harm cancer patients.


The next steps are to evaluate how often patients are using chatbots to seek out information about cancer, what questions they are asking, and whether AI chatbots provide accurate answers to uncommon or unusual questions about cancer.


Reference: Johnson SB, King AJ, Warner EL, Aneja S, Kann BH, Bylund CL. Using ChatGPT to evaluate cancer myths and misconceptions: artificial intelligence and cancer information. JNCI Cancer Spectrum. 2023;7(2):pkad015. doi: 10.1093/jncics/pkad015


This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.