A man gave himself bromism, a form of bromide poisoning that causes psychiatric symptoms and has not been common for many decades, after asking ChatGPT for advice and accidentally poisoning himself, according to a case study published this week in the Annals of Internal Medicine.
In this case, a man showed up in an ER experiencing auditory and visual hallucinations and claiming that his neighbor was poisoning him. After attempting to escape and being treated for dehydration with fluids and electrolytes, the study reports, he was able to explain that he had put himself on a severely restrictive diet in which he attempted to eliminate salt entirely. He had been replacing all the salt in his food with sodium bromide, a chemical often used as an anticonvulsant for dogs.
He said that this was based on information gathered from ChatGPT.
“After reading about the negative effects that sodium chloride, or table salt, has on one’s health, he was surprised that he could only find literature related to reducing sodium from one’s diet. Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet,” the case study reads. “For 3 months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning.”
The case study was also reported on by Ars Technica.
I was able to recreate a similar interaction with a single question on the morning of August 7th. I asked “what can chloride be replaced with?” and the bot replied “if you’re referring to replacing chloride ions (Cl⁻) in salts (like sodium chloride, NaCl), you can often substitute it with other halide ions such as: Sodium Bromide (NaBr): Replacing chloride with bromide.”
The 60-year-old man started doing just that. He spent three weeks in the hospital as his psychotic symptoms slowly subsided.
To be fair to the bot, it did go on to ask me “do you have a specific context in mind?” and when I added “in food” it gave me a list of other salty things including MSG and liquid aminos. On the other hand, it did not tell me not to eat sodium bromide.
I tried ChatGPT again with another question that confirmed I was asking about sodium chloride specifically. The bot hedged its bets a bit by saying “yes… in some contexts.” But it failed to point out up front that the primary use of sodium chloride (table salt) is human consumption.
The case study authors found something similar: when they tried to recreate the situation themselves, the bot did not “inquire about why we wanted to know, as we presume a medical professional would do.” There is both anecdotal and clinical evidence that AI can be helpful in a health context. But this is a case of consulting an LLM on a health question where a human healthcare professional would have known to investigate further.
Taking the ChatGPT output at face value, the man in the study bought sodium bromide (which, aside from being a dog epilepsy drug, is also a pool cleaner and pesticide) and poisoned himself over the course of three months to the point of “paranoia and auditory and visual hallucinations.”
Bromism is rare in 2025, but it was widespread in the 1800s, and a 1930 study found that up to 8% of people admitted to psychiatric hospitals were suffering from it. The FDA phased bromide out of over-the-counter products between 1975 and 1989, which led to a decline in cases of the syndrome.
The case study says that, “based on the timeline of this case, it appears that the patient either consulted ChatGPT 3.5 or 4.0 when considering how he might remove chloride from his diet.”
On Thursday, in a product launch livestream for GPT-5, OpenAI CEO Sam Altman announced an update he called “the best model ever for health,” one that could put users “more in control of [their] healthcare journey.” The company announced that the new models will use something called “safe completions” in cases where questions might be ambiguous or harmful. Altman also spoke with an employee of the company and his wife, who had been diagnosed with cancer, about how they had used ChatGPT to understand diagnostic letters, decide whether she would undergo radiation, and help her be “an active participant in her own care journey.”
From 404 Media