Is ChatGPT Safe For Medical Advice? Man Lands In ICU After Following AI’s Dangerous Tip

A 60-year-old New York man was hospitalised after following a dangerous dietary recommendation generated by ChatGPT, raising urgent questions about the safety of using artificial intelligence for medical guidance. The crisis began when the man asked ChatGPT for healthier alternatives to table salt. Among the AI’s suggestions was sodium bromide, a toxic chemical compound once used in sedatives but banned in food due to its harmful effects.

Believing it to be a safe substitute for sodium chloride, he consumed it for three months, leading to a rare and life-threatening case of bromide poisoning, or bromism. The incident, documented by doctors from the University of Washington in the Annals of Internal Medicine: Clinical Cases, is believed to be one of the first reported cases of severe poisoning directly linked to AI-generated dietary advice.

Dangerous Health Decline

According to the report, the man followed a strict salt-reduction plan suggested by the AI without consulting a doctor. Over time, he developed hyponatraemia (dangerously low sodium levels), alongside escalating neurological and dermatological problems.

When admitted to the hospital, the patient initially denied taking any medications or supplements. He later disclosed that he had been using sodium bromide purchased online, while also following other strict dietary restrictions and distilling his own water at home.

During his hospital stay, the man’s condition worsened: he developed paranoia, hallucinations, and extreme thirst. The case report noted that he was “paranoid about the water he was offered” and believed his neighbour was trying to poison him.

Doctors diagnosed bromism, a condition now rare in developed countries. He was treated with fluids and electrolytes and, once medically stable, was transferred to the hospital’s inpatient psychiatry unit for further care.

Recovery And Warnings

The man spent three weeks in the hospital before recovering. His case serves as a cautionary tale about the dangers of relying on AI for medical advice without professional oversight, especially when it concerns essential nutrients like sodium.
