• BootyEnthusiast@lemmy.dbzer0.com

    TL;DR: The teen used a common jailbreak method to get it to discuss suicide the way he wanted it to. The company argues it isn’t responsible because intentional jailbreaking is against the TOS.

    • skuzz@discuss.tchncs.de

      I wouldn’t call typing a sentence a “jailbreak” of any sort. More to the point, LLMs should simply refuse to engage with certain topics until there’s a decision, and regulation, on how they should respond. Although at this point, I’d gladly give up my coding assistant in exchange for making LLMs go away. Big tech is being reckless yet again, with little to no accountability. They need their toys taken away.