It’s somehow a fucking reality now…

Ah yes, should ban him for life for breaking the TOS! 💡
TL;DR: Teen used a common jailbreak method to get it to discuss suicide the way he wanted it to. Company argues that it’s not responsible because intentional jailbreaking is against the TOS.
I wouldn’t call typing a sentence a “jailbreak” of any sort - more that LLMs should flat-out refuse certain topics until some future point where how they respond is decided and regulated. Although at this point, I’d gladly give up my coding assistant in exchange for making LLMs go away. Big tech is once again being reckless, with little to no accountability. They need their toys taken away.
I mean, fair enough.
Hmmm… I don’t want to side with OAI… but…