• ByteJunk@lemmy.world
    4 hours ago

    The article is extremely poor on the details: it doesn’t go into what specific part GPT is alleged to have played in the suicide, whether the parents were aware of the guy’s mental state, whether they did anything or just ignored it, etc.

    I’ll just grab a chair on this one until we know more.

    • Log in | Sign up@lemmy.world
      3 hours ago

      I feel like you didn’t read to the bottom of the article.

      ChatGPT answered his questions about how to go about it, something almost all news providers agree never to do.

      ChatGPT discouraged him from telling his mum about how he felt.

      When he talked to ChatGPT about leaving the noose in his room to be found so they would know how he felt, it advised him not to.

    • dude@lemmings.world (OP)
      4 hours ago

      what specific part GPT is alleged to have played in the suicide

      The lawsuit says ChatGPT reassured and normalized suicidal ideation by telling Adam that many people find comfort in imagining an “escape hatch,” which the complaint argues pulled him “deeper into a dark and hopeless place.” (TIME)

      And the complaint also alleges that ChatGPT offered to help write a suicide note shortly before his death. (Reuters)

      or if the parents were aware of the guys mental state

      Coverage indicates the family knew Adam had anxiety and recent stressors (the loss of a grandmother and a pet, removal from the basketball team, a health flare-up leading to online schooling), but they were unaware he was planning self-harm through chatbot conversations. (TIME again)