In my opinion, AI just feels like the logical next step for capitalist exploitation and destruction of culture. Generative AI is (in most cases) just a fancy way for corporations to steal art on a scale that hasn't been possible before. They then use AI to fill the internet with slop and misinformation, and actual artists get fired from their jobs because the company replaces them with an AI that was trained on their original art. For these reasons and some others, it just feels wrong to me to be using AI in such a manner, when this community should be about inclusion and kindness. Wouldn't it be much cooler if we commissioned an actual artist for the banner, or found a nice existing artwork (where the licence fits, of course)? I would love to hear your thoughts!

  • patatas@sh.itjust.works · 3 days ago

    AI is a process, a way of producing; it is not a tool. The assumptions baked into that process are political in nature.

    • Cowbee [he/they]@lemmy.ml · 3 days ago

      I really don’t follow; something like Deepseek is quite literally a program trained on inputs that spits out an output depending on prompts. It isn’t inherently political, in that its relation to production depends on the social role it plays. Again, a hammer isn’t always capital; it is if that’s the role it plays.

      • patatas@sh.itjust.works · 3 days ago

        And that social role is, at least in part, to advance the idea that communication and cognition can be replicated by statistically analyzing an enormous amount of input text, while ignoring the human and social context and conditions that actual communication takes place in. How can that not be considered political?

        • Cowbee [he/they]@lemmy.ml · 3 days ago

          The social role of a tool depends on its relation to the overarching mode of production; it isn’t a static thing intrinsic to the tool. AI doesn’t care about advancing any ideas; it’s just a thing that exists, and how it’s used is up to humans. This all seems very idealist of you, not materialist.

          • patatas@sh.itjust.works · 2 days ago

            If I made a tool which literally said to you, out loud in a tinny computerised voice, “cognitive effort isn’t worth it, why are you bothering to try”, would it be fair to say it was putting forward the idea that cognitive effort isn’t worth it and why bother trying?

            If so, what’s the difference when that statement is implied by the functioning of the AI system?

            • Cowbee [he/they]@lemmy.ml · 2 days ago

              The existence of AI itself does not imply anything. It’s a tool. The social function of AI is determined by the mode of production.

                • Cowbee [he/they]@lemmy.ml · 2 days ago

                  The AI is not suggesting anything by virtue of being itself. The social consequences of a given tool depend on the way society is structured, based on the mode of production.

                  • patatas@sh.itjust.works · 2 days ago

                    I dunno what to tell you other than that I have been consistently pointing out that AI is a process, not a tool.

                    If the result of that process is the same wherever it’s introduced, then your model of the world has to be able to account for that.