AI rewrites medical scan reports to make them readable for patients

Medical scan reports are notoriously hard to understand. They're packed with technical terms, abbreviations, and jargon that can confuse patients and create unnecessary worry.
New research suggests AI tools like ChatGPT could help solve this problem. A major review found these systems can successfully rewrite complex radiology reports into plain English that ordinary people can actually understand.
The findings come as patients gain more direct access to their medical records through apps and online portals, making clear communication more important than ever.
How does it work?
Researchers examined 38 studies covering more than 12,000 radiology reports that had been simplified using AI systems. The reports covered X-rays, CT scans, and MRI results.
The AI tools took reports written at university reading level and rewrote them at a level an 11- to 13-year-old could understand. Patients, members of the public, and medical professionals then assessed how well the simplified versions worked.
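Reading levels like "university" or "11 to 13 years old" are typically estimated with readability formulas. The review does not specify which metric the studies used, so the following is purely an illustrative sketch of one common formula, the Flesch-Kincaid grade level, applied to a hypothetical report sentence and a hypothetical plain-English rewrite (both examples are invented for this demonstration):

```python
import re

def count_syllables(word):
    # Naive heuristic: count runs of vowels; every word has at least 1 syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    # Flesch-Kincaid grade level:
    #   0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical examples, not taken from the studies:
original = "Mild cardiomegaly with bibasilar atelectasis; no focal consolidation."
plain = ("Your heart looks slightly larger than normal. Small parts of your "
         "lungs are not fully open. There is no sign of infection.")

print(round(fk_grade(original), 1))  # high grade: dense jargon, long words
print(round(fk_grade(plain), 1))     # low grade: short words, short sentences
```

The jargon-heavy sentence scores at a postgraduate reading level, while the rewrite scores at roughly primary-school level — the kind of shift the studies measured.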
Most clinicians said the AI-rewritten reports were accurate and complete. However, about 1 percent contained errors, including at least one incorrect diagnosis. This suggests doctors still need to check the simplified versions before sharing them with patients.
Why does it matter?
Dr Samer Alabed, who led the research at the University of Sheffield, explains the core problem: "The fundamental issue with these reports is they're not written with patients in mind."
Complex medical language creates real barriers. Patients with lower health literacy or those who speak English as a second language struggle most. The confusion can lead to unnecessary anxiety or false reassurance.
The problem wastes valuable time too. Doctors often spend appointment time explaining report terminology instead of focusing on actual care and treatment. "Even small time savings per patient could add up to significant benefits across the NHS," Alabed said.
The context
This research comes as healthcare systems worldwide push for greater transparency. In the UK, services like the NHS App give patients direct access to their medical records. Similar policies are expanding access to health information globally.
But there's a catch. None of the 38 studies was conducted in the UK or in NHS settings. The Sheffield research team says it is now working to fill this gap with real-world testing.
"Our long-term goal is not to replace clinicians, but to support clearer, kinder, and more equitable communication in healthcare," Alabed said. The team wants to develop oversight models where doctors review and approve AI-generated explanations before patients see them.
The approach could make medical information more accessible while maintaining safety standards. But proper testing in actual healthcare settings will be essential before widespread adoption.

