• TommySoda@lemmy.world · 38 points · edited 4 days ago

    At first it’s funny. Then it’s cringe. Then it’s funny again. And then it’s absolutely horrifying when you realise this is what some people are actually doing with these chatbots. And I’ve seen people make fun of those who fall into AI-induced psychosis, and I find that pretty sad, honestly. As someone who has gone through problems with depression, loneliness, and delusional thoughts in my past, I am so thankful that these chatbots did not exist back then. You don’t have to be stupid to fall into this, you just have to be vulnerable. This type of shit can cause permanent damage to someone’s mind and these companies simply do not give a shit about us.

    • atrielienz@lemmy.world · 8 points · 3 days ago

      You don’t have to be stupid to fall into this, you just have to be vulnerable. This type of shit can cause permanent damage to someone’s mind and these companies simply do not give a shit about us.

      I feel the same about being poor. Being poor isn’t a moral failing. And it can happen to just about anyone. It can and most of the time does cause permanent damage to the mind and the body. But there are plenty of people who do view poverty as a moral failing, and help to wage a class war that only profits the rich.

      The difference here is, often, when you are poor you don’t have a choice. People don’t live in poverty by choice. But people often do use LLMs for therapy and mental health by choice. They have been warned it’s dangerous. But it’s kind of like when you see fire trucks whiz past you on your way home. You hope everyone is okay, but you never think it’s your house that’s on fire.

      And when it inevitably burns them, they often don’t have any support structure in place to help. Which compounds the damage.

      An argument could potentially be made that the people who use them this way don’t have a choice, because the healthcare system in the US is so bad, and mental healthcare especially is so unavailable, even to those who have insurance. But most of these people also recognize on some level that they are nothing more than a product for these companies, and that these companies are not their friends.

      So even though I generally agree with you, I also recognize that it’s not so cut and dried that “scorning people who use them” is wrong. We aren’t going to convince lawmakers that they need to be regulated, but we can try to dissuade people from using them.

      I don’t think straight up saying “you’re a fool if you use gen-AI LLMs” does any good, but I can certainly understand why, after you’ve told people it’s a bad idea a million times, you might get fed up. Human nature.

  • recursive_recursion@piefed.ca (OP) · 22 points · edited 4 days ago

    At 15 minutes I was in pure cringe; as it continued into 20 I was laughing my ass off, but as it went on I just had this dread and horror seep under my skin. At 30 I had to stop watching.

    Good gods, accidental AI psychosis is a real thing. What in the hell, man.

    • Big_Boss_77@lemmynsfw.com · 6 points · 4 days ago

      It was wild… I put it on as background, so the cringe wasn’t as debilitating. The codependent feedback loop that eventually developed was fascinating.

  • Mwa@thelemmy.club · 8 points · edited 3 days ago

    Tbh, why do people trust AI (LLMs) with their personal info? Most of them, like ChatGPT, train on the data.
    I don’t know if this is off topic.

    • Fizz@lemmy.nz · 1 point · 20 hours ago

      A lot of people don’t actually care if their information is out there. They know they’re on lists that record where they live, how old they are, what they like, etc. They don’t care at all about data collection or privacy.

      It’s hard to understand as someone who cares and tries to mitigate it.