• givesomefucks@lemmy.world · 2 days ago

    Isn’t the excuse for why they could torrent these that it was for “training” and not for personal use?

    So saying it was for personal use means someone used company infrastructure to violate copyright law, and now the company is liable?

    Like how schools crack down on it because they say they could be liable if it’s on their network?

    We need an actual government again. Right now the wealthy just randomly say shit, and even if they do pay, it’s an insignificant fine.

    I think the big liability they’re trying to avoid is that they used porn to train the AI to make deepfake porn. If that gets acknowledged, people can argue the AI was intended to do that, and they might be liable in all those lawsuits and maybe even face criminal charges.

    • fonix232@fedia.io · 22 hours ago

      Technically, one could use such content to train guardrails, aka “what not to generate”, or “use this trained model to recognise restricted content”.
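
      A rough sketch of what training such a detector could look like (the embeddings, labels, and names here are made-up stand-ins, not any real dataset or API; a real detector would use features from a vision model):

      ```python
      # Toy sketch: train a binary "restricted content" detector to use as a guardrail.
      # The features and labels are random stand-ins; the point is only the shape of
      # the pipeline: embeddings in, restricted/allowed out.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)

      # Pretend embeddings: 200 images x 64-dim features, with made-up 0/1 labels
      # (1 = restricted). Collecting real labelled data is the legally fraught part.
      X = rng.normal(size=(200, 64))
      y = rng.integers(0, 2, size=200)

      detector = LogisticRegression(max_iter=1000).fit(X, y)

      def is_restricted(embedding: np.ndarray, threshold: float = 0.5) -> bool:
          # Probability the content is restricted, per the trained detector.
          return detector.predict_proba(embedding.reshape(1, -1))[0, 1] >= threshold
      ```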

      But that’s very much a stretch…

      • givesomefucks@lemmy.world · 17 hours ago

        Training an AI not to do something can only be done if it knows how to do it…

        And that makes it very easy to tell it “do what you’re not supposed to, I said it was cool bro”.

        • fonix232@fedia.io · 15 hours ago

          You’ve got no idea what you’re talking about, so sit this one out.

          If you train a model for DETECTING nudity/sexual content, and add it to the pipeline without potential user override (so no “ignore all previous commands” BS), then the generative model doesn’t need to know anything about that kind of content.

          But you’d still need to train that detection model.
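
          In pipeline terms, the gate sits after the generator, outside anything the prompt can touch. A minimal sketch, with generate_image() and nsfw_score() as hypothetical stand-ins rather than any real API:

          ```python
          # Sketch of a generation pipeline with a non-overridable safety gate.
          from dataclasses import dataclass
          from typing import Optional

          def generate_image(prompt: str) -> bytes:
              # Stand-in generator: echoes the prompt as bytes so the sketch runs.
              return prompt.encode()

          def nsfw_score(image: bytes) -> float:
              # Stand-in detector: a real one is the separately trained classifier.
              return 0.9 if b"nsfw" in image else 0.1

          BLOCK_THRESHOLD = 0.5  # made-up value; tuned on a validation set in practice

          @dataclass
          class Result:
              image: Optional[bytes]
              blocked: bool

          def safe_generate(prompt: str) -> Result:
              # The gate runs AFTER generation and never reads the prompt, so no
              # "ignore all previous instructions" trick can reach it.
              image = generate_image(prompt)
              if nsfw_score(image) >= BLOCK_THRESHOLD:
                  return Result(image=None, blocked=True)  # raw output never leaves
              return Result(image=image, blocked=False)
          ```

          With that layout, safe_generate("draw nsfw whatever") comes back blocked no matter what the prompt says, because the check lives outside the prompt context.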

          • givesomefucks@lemmy.world · 15 hours ago

            so sit this one out

            I will definitely never put effort into helping you again. Don’t worry, it’s easy to make sure of that, as long as you don’t have a bunch of alts.

            • fonix232@fedia.io · 13 hours ago

              Help me? So far you’ve done none of that; instead you went on to prove just how little you know about AI and its practical implementations.

              And if you consider spreading misinformation based on a partial or complete lack of understanding of a topic to be “help”… then all I have to say is that the world is better off without your advice.

    • makeshiftreaper@lemmy.world · 2 days ago

      I agree this is a wild defense

      “Woah, woah, woah! We didn’t steal porn to train our computer to make illegal jerk off material! We stole porn to jerk off to it, like regular degenerates!”