What a shitty fluff article. You can only see the details once you subscribe to their site.
TL;DR: A guy asked ChatGPT what chloride in salt could be replaced with, and it read the question as chemistry, so it pointed to other halides like bromide. It did NOT recommend that he eat it. He inferred that on his own, then gave himself a psychiatric illness (bromism) by eating buckets of it.
This is an unfortunate illness caused by human stupidity.
Don’t blame human stupidity on LLMs.
Please. Please! Stop using LLMs as therapists. Talking to a poorly-drawn coconut with a face would be a healthier alternative. For people who don't have access to real therapy, receiving no therapy at all is still more likely to lead to a positive outcome than using an LLM.
How long before ChatGPT brings back phrenology?