• PiraHxCx@lemmy.ml

      It wasn’t the chatbot that gave the kid a device with unrestricted internet access, nor is it the chatbot’s responsibility to educate the child. Fuck parents who can’t be bothered with parenting.

      • optissima@lemmy.ml

        Maybe it’s the conditions around her that are making it hard to parent? Not saying she’s innocent, but there’s a lot more to it than “parent chooses not to raise child.”

        • PiraHxCx@lemmy.ml

          Why is she choosing to sue the chatbot when her kid isn’t even legally allowed to use it, though? She seems to think it’s other people’s responsibility to monitor her kid’s activities and set boundaries…

          • optissima@lemmy.ml

            Show me an easy, zero-tech, zero-cost way to keep your kid off certain sites without also blocking the sites they need for school research. I’m not saying she isn’t at fault, but the lack of support for parents is just as real.

            • piccolo@sh.itjust.works

              Step one: build trust and educate them about the dangers of the internet.

              Step two: periodically check in on their activities.

              Step three: don’t give them unrestricted access unless they’re mature enough to handle it.

              If your child can’t trust you with what they’re doing, no amount of tech is going to stop them from circumventing monitoring…

            • Sculptor9157@sh.itjust.works

              The key is zero tech, as in don’t give them the device in the first place. Research devices get used only when a supervising adult is present.

              • optissima@lemmy.ml

                How does the parent handle the social ostracization the kid will likely face then? It’s more complex than just cutting them off.

                • atomicbocks@sh.itjust.works

                  You’re asking that as if poor kids aren’t already ostracized for not having these things. The problem is giving kids access in the first place.

    • mushroommunk@lemmy.today

      I don’t know if it’s that unpopular. From what I’ve seen on Lemmy, most people think both parties are in the wrong. Let them both go down.

      • PiraHxCx@lemmy.ml

        Well, I already got a downvote :P
        From what I’ve seen on Lemmy, if someone opened a medicine cabinet and swallowed every pill, ignoring the package-insert warnings and disclaimers, the company that made the pills would be held responsible for it.

  • skhayfa@lemmy.world

    That’s some very old school taste for an 11 year old. Maybe dad knows something about this?

  • Rhaedas@fedia.io

    Suing is easier than talking to your kids about sex. This isn’t about AI; it’s about blaming someone else for a failure to parent and communicate. Playboy, Hustler, and the like had lawsuits brought against them because kids managed to find them and see things their parents didn’t want to discuss. Some things never change.

  • yuri@pawb.social

    a lot of blaming the mom here, and she is by no means without blame, BUT

    if it was a human talking to this kid, there would be an arrest made. it’s wild to me that just because it’s a chatbot instead, the blame seems to shift ENTIRELY to the parents.

    like this would be a WILD reaction to a regular grooming case.

  • vrek@programming.dev

    Real unpopular opinion: who cares? OK, your kid saw naked boobies… he likely sucked on them when he was even younger. If he saw a vagina, so what? He literally came out of one.

    Isn’t it worse to say “no, boobies and vaginas are bad, you should never be around them”? Either they’ll idolize them as the forbidden fruit and become a sex pest, or they’ll genuinely come to see them as bad and evil, learn to hate women, and likely be abusive to whichever unlucky soul decides to give them a chance.

    At most, have a conversation about limits and boundaries. Maybe even discuss how AI isn’t really intelligent and how it infringes on privacy and copyright, depending on their age…

    • LastYearsIrritant@sopuli.xyz

      Maybe because there are no guardrails keeping the kid grounded in how to interact with real humans.

      You’re giving a child active feedback that reinforces their bad behavior. That isn’t just “kid found porn.” This is more “robot is abusing a child.”

      • vrek@programming.dev

        Yeah, I guess, but there’s no real difference between that and a lot of porn. The JOI fetish, for example: same idea. Again, I echo everyone else: this is on the mother. Why would you expect a computer to teach your child good vs. bad? That’s a whole other issue from what I brought up.

        I was saying that when the mother found this out, which she eventually did, it should have sparked a conversation about right vs. wrong, how to interact with other humans, and, depending on age, what “AI” actually is…

        They shouldn’t expect guardrails… It’s like having two kids and one hits the other. You don’t make them wear oven mitts to prevent hitting; you explain why hitting is bad, you make them understand and express empathy, and maybe you punish them (time in the corner or taking away video games or such, not advocating hitting/spanking them).