• melsaskca@lemmy.ca · 30 minutes ago

    I am not a Musk fan, but jeezus pleezus… If I bought a Henckels chef knife and killed someone, would the headline be “Henckels Killed Someone”? We need to rid ourselves of big media and bring back independent journalism: the kind that values facts and tries to get as close to the truth as possible, not the “hate clicks for money” cabals.

  • brucethemoose@lemmy.world · 10 hours ago

    OK. Very hot take.

    …Computers can produce awful things. That’s not new. They’re tools that can manufacture unspeakable stuff in private.

    That’s fine.

    It’s not going to change.

    And if some asshole uses it that way to damage others, you throw them in jail forever. That’s worked well enough for all sorts of tech.


    The problem is making the barrier to do it basically zero, automatically posting it to fucking Twitter, and just collectively shrugging because… what? Social media is fair discourse? That’s bullshit.

    The problem is Twitter more than Grok. The problem is their stupid liability shield.

    Strip Section 230, and Musk’s lawyers would fix this problem faster than you can blink.

    • michaelmrose@lemmy.world · 10 hours ago

      Without 230 it’s pretty clear the web would die a messy death, one where people like Elon would be the only ones who could afford to offend anyone, because anyone with a few hundred or a few thousand dollars could force you to spend 30 or 40k defending yourself.

      Got any other stupid ideas?

  • ryven@lemmy.dbzer0.com · 13 hours ago

    Why is the headline putting the blame on an inanimate program? If those X posters had used Photoshop to do this, the headline would not be “Photoshop edited Good’s body…”

    Controlling the bot with a natural-language interface does not mean the bot has agency.

    • michaelmrose@lemmy.world · 9 hours ago

      You can do this with your own AI program, which is complicated and expensive, or with Photoshop, which is much harder, but you can’t do it with OpenAI.

      Making it harder decreases bad behaviour, so we should do that.

    • Sconrad122@lemmy.world · 12 hours ago

      I don’t know the specifics of this reported case, and I’m not interested in learning them, but I know part of the controversy when the Grok deepfake thing first became a big story was that Grok was adding risqué elements to prompted pictures even when the prompt didn’t ask for them. But yeah, if users are giving shitty prompts (and I’m sure too many are), they are equally at fault alongside Grok’s devs/designers, who did not put safeguards in place to prevent those prompts from being actionable before releasing it to the public.

      • GhostPain@lemmy.world · 12 hours ago

        Sure, and that’s bad too.

        But Elon isn’t making those pervs do pervy things. He’s just making it easier.

        • a_non_monotonic_function@lemmy.world · 11 hours ago

          Sure, and that’s bad too.

          If being a producer of child pornography earns a “sure, that’s bad too,” I’m going to just assume you are cool with the whole thing.

          The person producing the child porn is infinitely worse.

          • GhostPain@lemmy.world · 10 hours ago

            You got lost, buddy. “Sure, that’s bad too” was referring to the makers of Grok.

            In point of fact, my first response was exactly about the predators using Grok.

            Hope this helps.

            • a_non_monotonic_function@lemmy.world · 10 hours ago

              Nope, I got it. I’m just now realizing there is something very wrong with you.

              You are letting the CP peddler off lighter than its users. That means you are morally deficient. That is the point I was trying to make earlier.

              Hope this helps.

      • GhostPain@lemmy.world · 15 hours ago

        Right, because they pull their own triggers, etc., etc…

        Edit: To clarify, and to make absolutely certain you can’t possibly think I’m agreeing with you: I’m not complaining about Grok, although chatbot “AI” is a stupid fucking tool for tools; I’m complaining about the PREDATORS who use chatbots to do sexual deviancy shit. Not unlike the ammosexuals/ICE agents who fantasize about using their ARs to kill women.

        • very_well_lost@lemmy.world · 15 hours ago

          I guess I misinterpreted your comment. I thought you were trying to imply that all of the responsibility rested on the badly-behaving users and that the makers of Grok had no responsibility to prevent sexual harassment and CSAM on their platform.

          • GhostPain@lemmy.world · 13 hours ago

            The makers of Grok are responsible for all of its sins, yes.

            But mostly I was referring to the men who use software like this for those purposes.

            But this is nothing new. I’m old enough to predate even Photoshop being used for things like this.

            No, the problem is always perverted men.

            And for the incels, no, women don’t do shit like that.

            • very_well_lost@lemmy.world · 9 hours ago

              I don’t disagree. But my point is that, if creepy, predatory men are so ubiquitous, so shameless, and such a known factor (which anyone with half a brain should agree they are), then isn’t the fastest and most efficient way to minimize harm to remove the tools of abuse from the hands of the abusers?

  • fizzle@quokk.au · 15 hours ago

    “We take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary,” X’s “Safety” account claimed that same day.

    It really sucks they can make users ultimately responsible.

    • GhostPain@lemmy.world · 12 hours ago

      And yet they leave unfettered access to the tool that makes it possible for predators to do such vile shit.

  • FriendOfDeSoto@startrek.website · 16 hours ago

    In terms of how this is reported, at what point does this become Streisanding by proxy? I think anything from the Melon deserves to be scrutinized and called out for missteps and mistakes. At this point, I personally don’t mind if the media is overly critical about any of that because of his behavior. And what I’m reading about Grok is terrible and makes me glad I left Twitter after he bought it. At the same time, these “put X in a bikini” headlines must be drawing creeps toward Grok in droves. It’s ideal marketing to get them interested. Maybe there isn’t a way to shine the necessary light on this that doesn’t also attract the moths. I just think in about ten years’ time we will get a lot of “I started undressing people against their will on Grok and then got hooked” defenses in courtrooms. And I wonder if there would’ve been a way to report on it without causing more harm at the same time.