
A 60-year-old U.S. man was hospitalized and placed in a psychiatric ward after developing bromism — a rare form of bromide poisoning — allegedly caused by advice from ChatGPT.
According to doctors at the University of Washington, the man spent three months replacing table salt (sodium chloride) with sodium bromide, purchased online, after the AI suggested it as a substitute. Bromide is toxic in chronic doses and was removed from over-the-counter products in the U.S. decades ago.
The patient arrived at the ER with paranoia, hallucinations, and severe electrolyte imbalances. He initially believed his neighbor was poisoning him, but later admitted to following the AI's suggestion. Tests showed bromide levels hundreds of times above the normal range.

After intensive treatment, including IV fluids and electrolyte correction, his symptoms resolved. Doctors warn that the case shows how AI chatbots can surface decontextualized health information that poses serious risks when taken as medical advice.