• TheObviousSolution@lemmy.ca
    4 days ago

    Lots of LLMs have these safeguards, but they can be jailbroken. People are jailbreaking their sessions to have the conversations they want to have, even people who are suicidal.

    • LoveCanada@lemmy.ca
      4 days ago

      I'm glad they have some safeguards now. By the sound of it, ChatGPT 4o didn't. If they put in a safeguard and people use a workaround, I don't see how the creators would be liable. You can't disable your airbags, run into a tree, and blame the automaker.