Dr. Sudhir's Pain Relief Clinic

August 29, 2025

A 60-Year-Old Man Nearly Lost His Life Because of ChatGPT — Here’s the Truth

Artificial Intelligence (AI) is shaping how we live, work, and even make health choices. With instant answers at our fingertips, many people are tempted to ask AI instead of visiting a doctor. But what happens when the advice is dangerously wrong? A recent case shows us just how serious the consequences can be.

The Shocking Case

A 60-year-old man wanted to reduce his salt intake for better health. Instead of consulting a medical professional, he asked ChatGPT for guidance. The chatbot suggested replacing table salt with sodium bromide — a compound that is toxic and no longer used in medicine.

For nearly three months, he unknowingly consumed bromide every single day. Soon, his health began to collapse:

  • He developed paranoia and terrifying hallucinations, believing his neighbor was trying to poison him.
  • He experienced severe fatigue, insomnia, and difficulty coordinating movements.
  • His skin developed unusual signs, including acne-like eruptions and red spots known as cherry angiomas.

Doctors eventually diagnosed him with bromism, a rare but dangerous poisoning that can affect the brain and nervous system. Without urgent treatment, it can cause confusion, seizures, and even coma. Fortunately, with proper medical care, he survived and recovered.

The Hidden Danger of Blindly Following AI

This case is a clear warning: while AI tools can provide information, they are not doctors. They cannot:

  • Review your medical history.
  • Evaluate whether advice is safe in your situation.
  • Adjust care when complications appear.

AI can offer knowledge, but it cannot guarantee safety. Blindly trusting it can put your life in danger.

Why Doctors Still Matter

At Dr. Sudhir’s Pain Relief Clinic, we welcome technology as a support tool, but it can never replace medical expertise. Only a doctor can:

  • Provide personalized treatment based on your condition.
  • Ensure the advice is safe, tested, and effective.
  • Monitor progress and guide recovery with care.

Many patients say that after meeting a doctor, they feel relieved, reassured, and confident about their health decisions. That sense of security is something no AI chatbot can provide.

How You Can Stay Safe

  • Use AI only for general awareness, not as medical instructions.
  • Always double-check health advice with a qualified doctor.
  • Protect your well-being by relying on expert medical care.

Final Word

A 60-year-old man nearly lost his life because of advice from ChatGPT. His story is a reminder that while AI can be useful, it cannot replace human judgment in healthcare. At Dr. Sudhir’s Pain Relief Clinic, our focus is keeping you safe, healthy, and informed — because nothing is more valuable than your health.

____________________

📞 Talk to our specialists at 03369028275.

📲 WhatsApp Channel
📺 YouTube Channel