• rolling@lemmy.world
    4 days ago

    Sorry, I made the comment about being on Fuck AI because of your edit to the original message. I wasn’t trying to accuse you of anything.

    Back to the AI stuff. I am sorry if I am a little sceptical about your claims about the “next generation of AI” and how “soon” it will outperform humans when, even after all these years and all the money and energy poured into these models, they still manage to fuck up a simple division question. Good luck making any model that needs to be trained on data perfect at this point, because the AI slop that has already been generated and released onto the internet has taken care of that. Maybe we will have AGI at some point, but I will believe that when I actually see it.

    Finally, I don’t know about modern art being absurdly simplistic. How can you look at modern animation or music and call it absurdly simplistic? How can you look at the thousands of game UI designs on Edd Coates’ website and call them absurdly simplistic? All AI will ever create when it comes to art is some soulless amalgamation of what it has seen before. It will kill all the creativity, originality and personality in art, but businessmen in suits will gladly let it replace human artists because it is cheaper than hiring human artists and designers.

    • Scubus@sh.itjust.works
      4 days ago

      Yeah, I definitely get that. I suspect there will soon be techniques for sanitizing training data, although that just makes unethical capture easier. And assuming the final goal is sentience, I’m not entirely sure it is unethical to train on other people’s data as long as you control for overfitting. The reasoning being that humans do the exact same thing: we train on every piece of media we’ve ever seen and use that to inspire “new” forms of media. Humans don’t tend to have original thoughts; we just rehash what we’ve heard. So every time you see a piece of media, you quite literally steal it mentally. It’s clearly a different argument with modern AI, and I’m not claiming it does the same thing. But its main issue here seems to be overfitting: too much of its inspiration can be seen directly. Sometimes it comes off as simply copying an image that was in its training data. That’s not inspiration, that’s plagiarism.

      And yeah, I tend to assume we’re going to kill off capitalism, because if we don’t, this discussion isn’t going to matter anyway.