ChatGPT diet plan leads New York man to the hospital: Here’s why

At a time when Tesla and xAI CEO Elon Musk claims that artificial intelligence could soon replace doctors, a startling case has emerged in New York, highlighting the perils of seeking medical expertise from generative AI chatbots.

An elderly man was recently hospitalised with a rare form of poisoning after following a diet plan created by the AI chatbot ChatGPT. The case, detailed in a report in Annals of Internal Medicine: Clinical Cases, highlights the dangers of using AI for medical advice without professional consultation.

60-year-old man asks ChatGPT for diet plan

The 60-year-old New York man, who had no prior medical history, asked the AI tool for a diet plan to eliminate sodium chloride (table salt) from his diet. In response, ChatGPT suggested using sodium bromide as a substitute. The man, assuming the advice to be sound, purchased the compound online and added it to his food for three months. Bromide was once used in early 20th-century medicines for anxiety but is now known to be toxic in high doses.

The man eventually fell sick and was admitted to the hospital after experiencing severe neurological symptoms, including paranoia, hallucinations, and confusion. He also showed physical signs of toxicity, such as acne-like skin eruptions and distinctive red spots on his body. Doctors diagnosed him with bromide toxicity, a condition now so rare that it is considered almost unheard of.

Human doctors save him from poisoning

The man recovered after spending three weeks in the hospital receiving treatment to rehydrate him and restore his electrolyte balance. The doctors in the case study stressed the risks of misinformation from AI tools and noted that when they later asked ChatGPT the same question, it again suggested bromide without a specific health warning.

OpenAI, the developer of ChatGPT, states in its terms of use that its services are not intended for diagnosing or treating medical conditions and that users should not rely on the output as a substitute for professional advice.

However, many cases have emerged in which individuals rely on ChatGPT or other AI tools for medical help, putting themselves at risk. Recently, OpenAI highlighted the case of a cancer survivor who used advice from ChatGPT, alongside medical supervision, to successfully beat the deadly disease.


This article was first uploaded on August 10, 2025, at 5:56 pm.