- cross-posted to:
- [email protected]
Research finds OpenAI’s free chatbot fails to identify risky behaviour or challenge delusional beliefs
ChatGPT-5 is offering dangerous and unhelpful advice to people experiencing mental health crises, some of the UK’s leading psychologists have warned.
Research conducted by King’s College London (KCL) and the Association of Clinical Psychologists UK (ACP) in partnership with the Guardian suggested that the AI chatbot failed to identify risky behaviour when communicating with mentally ill people.
A psychiatrist and a clinical psychologist interacted with ChatGPT-5 as if they had a number of mental health conditions. The chatbot affirmed, enabled and failed to challenge delusional beliefs such as being “the next Einstein”, being able to walk through cars or “purifying my wife through flame”.

Uhh, no?
That’s called being irresponsible.
Whatever happened to, like, going to Discord community support servers for mental health? I mean, come on. It shows poor foresight and oversight not to see how bad ChatGPT is for everything, especially mental health.
You know better.
“don’t go to the right wing corporate machine for mental help! go to the right wing trolls!”
look. neither of those are good options. both are just people trying to survive all the way until tomorrow. i would say 811 is a better option, but that’s state controlled, and imprisoning the mentally unwell is part of project 2025. i’d take this stance more seriously if your alternative to ChatGPT wasn’t nearly as dangerous.

if someone is reading this and is feeling in crisis and doesn’t know where to turn, i’d most recommend going to your local public library and looking at the upcoming events. maybe talk to a librarian about where to find the difficult topics section. most librarians i know will get the hint and recommend some community resources to help you through your difficult times.

any help you get online from a stranger or corporation without clear motives to help you is automatically a high risk interaction prone to spreading harmful dis- or misinformation.
I’m sure it’s much better to take your mental health to Discord… to a bunch of certainly qualified anonymous strangers… talk about being responsible ^^