American Hospitalized After Following ChatGPT’s Deadly Salt Swap Advice

A 60-year-old U.S. man was hospitalized and placed in a psychiatric ward after developing bromism — a rare form of bromide poisoning — allegedly caused by advice from ChatGPT.

According to doctors at the University of Washington, the man spent three months replacing table salt (sodium chloride) with sodium bromide, purchased online, after the AI suggested it as a substitute. Bromide is toxic when it accumulates in the body, and bromide compounds were phased out of U.S. over-the-counter medicines decades ago because of that toxicity.

The patient arrived at the ER with paranoia, hallucinations, and severe electrolyte imbalances. He believed his neighbor was poisoning him, but later admitted to following the AI’s suggestion. Tests showed bromide levels hundreds of times above normal.

After intensive treatment, including IV fluids and electrolyte correction, his symptoms resolved. Doctors warn that the case shows how AI chatbots can spread decontextualized health information, which poses serious risks when taken as medical advice.
