ChatGPT taught teen a jailbreak so bot could assist in his suicide, lawsuit says.
Personal note: If you are having suicidal thoughts, please be aware that the number of suicidal thoughts a "normal" person is supposed to have is zero. Speaking as someone who has struggled with this: get help before it gets worse.