Man asks ChatGPT for advice on how to cut salt, ends up in hospital with hallucinations

A 60-year-old man asked ChatGPT for advice on how to replace table salt, and the substitution landed him in the emergency room suffering from hallucinations and other symptoms.
In a case report published this month in the Annals of Internal Medicine, three doctors from the University of Washington in Seattle used the man’s case to illustrate how AI tools, as they are currently designed, are not always reliable sources of medical advice.
“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the authors, Audrey Eichenberger, Stephen Thielke and Adam Van Buskirk, wrote.
The patient initially sought medical help at an unspecified hospital emergency room because he feared his neighbour was poisoning him. In the first 24 hours after he was admitted, his paranoia worsened and he experienced visual and auditory hallucinations, resulting in an involuntary psychiatric admission.
Once his symptoms were under control, the patient, who had studied nutrition in college, revealed that he had been reading about the harms that sodium chloride (table salt) can pose to a person’s health. Instead of simply reducing sodium intake (from table salt and other food additives), as is often recommended, he decided to conduct a personal experiment and eliminate chloride from his diet entirely. He then asked ChatGPT for suggestions on what could substitute for the chloride in table salt.
ChatGPT suggested that he should use sodium bromide instead, he said.
Sodium bromide, which closely resembles table salt, is used in water treatment, as an anticonvulsant in veterinary medicine and in film photography.
“For three months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning,” the case study authors wrote.
Bromide should not be ingested. It is unclear whether the AI tool gave the man any kind of warning.
“Unfortunately, we do not have access to his ChatGPT conversation log and we will never be able to know with certainty what exactly the output he received was, since individual responses are unique and build from previous inputs,” the authors wrote.
“However, when we asked ChatGPT 3.5 what chloride can be replaced with, we also produced a response that included bromide. Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do.”
The man already followed a very restrictive diet, one that doctors found had depleted his levels of important micronutrients such as vitamin C and B12. He was also reportedly very thirsty, yet suspicious of the quality of the water he was offered, since he distilled his own water at home. He was thoroughly tested and initially kept at the hospital for electrolyte monitoring and repletion.
His test results, combined with the hallucinations and other reported symptoms, including new facial acne, fatigue and insomnia, led the medical staff to believe the patient had bromism.
Bromism, a condition that was mostly reported in the early 20th century, is caused by ingesting large quantities of bromide. Normal bromide levels range from 0.9 to 7.3 mg/L; this patient’s level was 1,700 mg/L.
The patient remained in the hospital for treatment for three weeks, and was stable at his check-up two weeks after his discharge.
Bromism cases decreased after the U.S. Food and Drug Administration (FDA) eliminated the use of bromide in over-the-counter medications in the 1980s, the authors wrote. It had previously been used in treatments for insomnia, hysteria and anxiety. The condition has since re-emerged, however, with bromide being added to some unregulated dietary supplements and sedatives, and through excess consumption of dextromethorphan, a cough-medicine ingredient commonly sold as a bromide salt.
“While cases of bromism may remain relatively rare, it remains prudent to highlight bromism as a reversible cause of new-onset psychiatric, neurologic, and dermatologic symptoms, as bromide-containing substances have become more readily available with widespread use of the internet,” the authors wrote.
The doctors said that AI tools can help bridge the gap between scientists and the general public, but they also carry the risk of producing misinformation and presenting information out of context, something doctors are trained to avoid.
“As the use of AI tools increases, providers will need to consider this when screening for where their patients are consuming health information,” the authors said in the case study.
OpenAI, the company that created ChatGPT, recently announced changes to its system, including handling health-related questions more carefully. In one example, the chatbot provides information but also adds a note advising the user to check with a health professional.
In response to the bromide case, OpenAI told Fox News Digital that ChatGPT should not be used for health advice.
“Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance,” OpenAI said in a statement.