American Hospitalized After Following ChatGPT’s Deadly Salt Swap Advice

A 60-year-old U.S. man was hospitalized and placed in a psychiatric ward after developing bromism — a rare form of bromide poisoning — allegedly caused by advice from ChatGPT.

According to doctors at the University of Washington, the man spent three months replacing table salt (sodium chloride) with sodium bromide, purchased online, after the AI reportedly suggested it as a substitute. Bromide is toxic in sustained doses, and bromide-containing sedatives were removed from the U.S. market decades ago.

The patient arrived at the ER with paranoia, hallucinations, and severe electrolyte abnormalities. He initially believed his neighbor was poisoning him, but later admitted to following the AI's suggestion. Tests showed blood bromide levels hundreds of times above the normal range.

After intensive treatment, including IV fluids and electrolyte correction, his symptoms resolved. Doctors warn the case shows how AI chatbots can spread decontextualized health information that poses serious risks when taken as medical advice.
