ChatGPT helped identify woman’s rare condition after years of medical misdiagnosis

A 23-year-old woman from Cardiff says ChatGPT helped identify her rare neurological condition after four years of misdiagnoses from doctors. Phoebe Tesoriere was told she had anxiety, depression, and epilepsy before genetic testing confirmed the AI chatbot's suggestion of hereditary spastic paraplegia.
The case highlights both the potential and risks of patients using AI for medical guidance. While Phoebe eventually got her correct diagnosis, medical experts warn that AI chatbots can provide inconsistent health advice that may be dangerous if not discussed with healthcare professionals.
How did it work?
Phoebe's medical journey started in childhood with mobility issues. She had a limp from birth and balance problems, which doctors initially attributed to her being born without a hip socket. At 19, she collapsed and had a seizure at work, but doctors blamed anxiety.
Her symptoms worsened over several years:
- 2022: Diagnosed with epilepsy and given medication
- December 2024: Could not keep epilepsy medication down, leading to more seizures
- January 2025: Fell down stairs, spent three months in hospital with inconclusive tests
- July 2025: Major seizure left her in a coma for three days
After waking from the coma, a doctor told her she didn't have epilepsy after all; it was anxiety. That's when Phoebe decided to enter her symptoms into ChatGPT.
The AI chatbot returned a list of possible conditions, including hereditary spastic paraplegia. Despite initial hesitation, Phoebe brought this suggestion to her GP, who agreed it was plausible. Genetic testing confirmed the diagnosis.
Why does it matter?
This case shows how AI tools might help patients advocate for themselves when facing diagnostic challenges. According to the NHS, hereditary spastic paraplegia is often misdiagnosed; the condition is so rare that exact patient numbers are unknown.
However, medical experts stress the risks of self-diagnosis through AI. Dr. Rebeccah Tomlinson, a GP in Cardiff, says AI can be a helpful starting point for conversations with doctors, but patients still need professional medical guidance.
The case also reflects broader pressures on healthcare systems. As Dr. Tomlinson notes: "It's difficult for GPs to know everything. With the pressure on the NHS, we have to know even more."
The context
AI chatbots are increasingly being used for health advice: according to OpenAI, 230 million people ask ChatGPT health-related questions every week. Earlier this year, a University of Oxford study found that these tools provide mixed results, with some good advice and some bad, making it hard for users to know what to trust.
OpenAI launched a new ChatGPT Health feature in the US in January that can analyze medical records, though the company says it's meant to "support, not replace, medical care." It's unclear when or if this feature will reach the UK.
For Phoebe, the diagnosis has been life-changing. She can no longer work as a special educational needs teacher and now uses a wheelchair. But she's found new direction, studying for a master's degree in psychology because she still wants to help people. Her condition can be managed through physiotherapy, and she says having a proper diagnosis finally allows her to plan for the future with confidence.
🛠️Featured tool
Easy-Peasy
An all-in-one AI tool offering the ability to build no-code AI Bots, create articles & social media posts, convert text into natural speech in 40+ languages, create and edit images, generate videos, and more.

