James Cameron has reportedly revealed that an anti-AI title card will open Avatar 3, officially titled Avatar: Fire and Ash. The Oscar-winning director shared the news during a Q&A session in New Zealand attended by Twitter user Josh Harding.

Sharing a picture of Cameron at the event, they wrote: “Such an incredible talk. Also, James Cameron revealed that Avatar: Fire and Ash will begin with a title card after the 20th Century and Lightstorm logos that ‘no generative A.I. was used in the making of this movie’.”

Cameron has been vocal in the past about his feelings on artificial intelligence, speaking to CTV News in 2023 about AI-written scripts. “I just don’t personally believe that a disembodied mind that’s just regurgitating what other embodied minds have said – about the life that they’ve had, about love, about lying, about fear, about mortality – and just put it all together into a word salad and then regurgitate it,” he told the publication. “I don’t believe that’s ever going to have something that’s going to move an audience. You have to be human to write that. I don’t know anyone that’s even thinking about having AI write a screenplay.”

  • chemical_cutthroat@lemmy.world · 8 hours ago

    Human creativity, at its core, is not original. We smush things together, package it as something new, and in our hubris call it “original” because we are human, and thus infallible originators. Our minds are just electrical impulses that fire off in response to stimuli. There is no divine spark; that’s hogwash. From a truly scientific standpoint, we are machines built with organic matter. Our ones and zeros are the same as the machines we create, we just can’t deal with the fact that we aren’t as special as we like to think. We derive meaning from our individuality, and to lose that would mean that we aren’t individual. However, we are deterministic.

    If you woke up this morning and relived the same day that you already have, and had no prior knowledge of what had happened the previous time you experienced it, and no other changes were made to your environment, you would do the same thing that you did the first time, without fail. If you painted, you would paint the same image. If you ate breakfast, you would eat the same breakfast. How do we know this? Because you’ve already done it. Why does it work this way? Because nothing had changed, and your ones and zeros flipped in the same sequences. There is no “chaos”. There is no “random”. Nothing is original because everything is the way it is because of everything else. When you look at it from that bird’s eye perspective, you see that a human mind making “art” is no different than an LLM, or some form of generative AI. Stimulus is our prompt, and our output is what our machine minds create from that prompt.
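
    A minimal sketch of that thought experiment, assuming the "day" can be modelled as a seeded program (the seed stands in for identical initial conditions; `simulate_day` and its options are made up for illustration):

    ```python
    import random

    def simulate_day(seed: int) -> list[str]:
        # The seed stands in for identical initial conditions:
        # same environment, same memories, same stimuli.
        rng = random.Random(seed)
        options = {
            "breakfast": ["eggs", "toast", "cereal"],
            "painting": ["landscape", "portrait", "abstract"],
        }
        # Every "decision" is a pure function of prior state.
        return [rng.choice(opts) for opts in options.values()]

    # Reliving the same day: identical inputs, identical outputs.
    assert simulate_day(42) == simulate_day(42)
    ```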

    Our “black box” may be more obscure and complex than today’s AI technology, but that doesn’t make it different, any more than a modern sports car is different from a Model T. Both serve the same function.

    • oce 🐆@jlai.lu · 3 hours ago

      From a truly scientific standpoint, we are machines built with organic matter. Our ones and zeros are the same as the machines we create, we just can’t deal with the fact that we aren’t as special as we like to think. We derive meaning from our individuality, and to lose that would mean that we aren’t individual. However, we are deterministic.

      Do you have any scientific sources for the claim that we think in binary and that we are deterministic?

      I think you may be conflating your philosophical point of view with science.

      • barsoap@lemm.ee · 34 minutes ago

        All Turing-complete modes of computation are isomorphic, so binary or not is irrelevant. Both silicon computers and human brains are Turing-complete; both can compute all computable functions (given enough time and scratch paper).
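
        A minimal sketch of one direction of that equivalence: a few lines of Python simulating a Turing-machine rule table (the increment machine and its states are illustrative, not anything from the comment):

        ```python
        # A silicon computer simulating an abstract Turing machine.
        # The rule table increments a binary number held on the tape.
        RULES = {
            # (state, symbol) -> (write, head move, next state)
            ("carry", "1"): ("0", -1, "carry"),  # 1 + carry = 0, carry left
            ("carry", "0"): ("1", 0, "done"),    # 0 + carry = 1, halt
            ("carry", "_"): ("1", 0, "done"),    # ran off the left edge
        }

        def run(tape: str) -> str:
            cells = ["_"] + list(tape)            # blank cell for overflow
            pos, state = len(cells) - 1, "carry"  # head at least-significant bit
            while state != "done":
                write, move, state = RULES[(state, cells[pos])]
                cells[pos] = write
                pos += move
            return "".join(cells).lstrip("_")

        assert run("1011") == "1100"  # 11 + 1 == 12
        ```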

        If non-determinism even exists in the real world (it clashes with cause and effect in a rather fundamental manner), then the architecture of brains, nay, the life we know in general, actively works towards minimising its impact. Copying the genome, for example, has a quite high error rate at first; then error correction is applied, which brings the error rate down to practically zero; then randomness is introduced in strategic places, influenced by environmental factors. When the finch genome sees that an individual does not get enough food, it throws dice at the beak shape, not at the mitochondrial DNA.
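
        A toy illustration of that error-correction step, assuming a made-up 10% raw error rate and simple majority voting (real biological error correction is far more involved):

        ```python
        import random

        RAW_ERROR = 0.1  # assumed 10% chance a single copy of a bit flips

        def copy_bit(bit: int, rng: random.Random) -> int:
            # A noisy copy: flips the bit with probability RAW_ERROR.
            return bit ^ (rng.random() < RAW_ERROR)

        def corrected_copy(bit: int, copies: int, rng: random.Random) -> int:
            # Copy several times, then take a majority vote.
            votes = sum(copy_bit(bit, rng) for _ in range(copies))
            return int(votes > copies / 2)

        rng = random.Random(0)
        trials = 100_000
        raw = sum(copy_bit(0, rng) for _ in range(trials)) / trials
        fixed = sum(corrected_copy(0, 9, rng) for _ in range(trials)) / trials
        print(raw)    # ~0.10: quite high error rate at first
        print(fixed)  # ~0.001: majority voting brings it toward zero
        ```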

        It’s actually quite obvious in AI models: the reason we can quantise them (essentially rounding every weight so the model runs with lower-precision maths, faster and in less memory) is that the architecture is ludicrously resistant to noise, and rounding every number is, from the model’s perspective, equivalent to adding noise.
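
        A rough sketch of that quantisation argument using a random toy "layer" (the shapes and int8 scheme are assumptions for demonstration, not any particular model's method):

        ```python
        import numpy as np

        rng = np.random.default_rng(0)
        weights = rng.normal(size=(256, 256)).astype(np.float32)
        x = rng.normal(size=256).astype(np.float32)

        # Symmetric int8 quantisation: scale, round, dequantise.
        scale = np.abs(weights).max() / 127.0
        q = np.round(weights / scale).astype(np.int8)
        dequantised = q.astype(np.float32) * scale

        exact = weights @ x
        approx = dequantised @ x
        # Rounding every weight is equivalent to adding small noise;
        # the layer's output barely moves.
        print(np.max(np.abs(exact - approx)) / np.max(np.abs(exact)))
        ```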

      • chemical_cutthroat@lemmy.world · 3 hours ago

        The deterministic universe is a theory as much as the Big Bang is. We can’t prove it, but all of the evidence is there. Thinking in binary is me making a point about how our minds interact with the world: if you break down any interaction to its smallest parts, it becomes a simple yes/no, or on/off; we just process it far faster than we ever notice it in those terms.

        • oce 🐆@jlai.lu · 2 hours ago

          There are various independent, reproducible measurements that give weight to the hot Big Bang theory over other cosmological models. Are there any for the deterministic nature of humans?
          Quantum physics, for example, is not deterministic. While quantum decoherence explains why macroscopic physical systems behave deterministically, can we really say quantum effects couldn’t play a role in our neurons?
          On a slightly different point, quantum bits are not binary; they can represent a continuous superposition of multiple states. Why would our minds be closer to binary computing than to quantum computing?

          • chemical_cutthroat@lemmy.world · 54 minutes ago

            The comparison between human cognition and binary isn’t meant to be taken literally as “humans think in 1s and 0s” but rather as an analogy for how deterministic processes work. Even quantum computing, which operates on superposition, ultimately collapses to definite states when observed—the underlying physics differs, but the principle remains: given identical initial conditions, identical outcomes follow.

            Regarding empirical evidence for human determinism, we can look to neuroscience. Studies consistently show that neural activity precedes conscious awareness of decisions (Libet’s experiments and their modern successors), suggesting our sense of “choosing” comes after the brain has already initiated action. While quantum effects theoretically could influence neural firing, there’s no evidence these effects propagate meaningfully to macro-scale cognition—our neural architecture actively dampens random fluctuations through redundancy.

            The question isn’t whether humans operate on binary code but whether the system as a whole follows deterministic principles. Even if quantum indeterminacy exists at the micro level, emergence creates effectively deterministic systems at the macro level. This is why weather patterns, while chaotic, remain theoretically deterministic—we just lack perfect information about initial conditions.
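
            A minimal sketch of that weather point, using the textbook Lorenz '63 equations as a stand-in for a chaotic but deterministic system (the step size and run length are chosen only for the demo):

            ```python
            def lorenz(x, y, z, dt=0.01, steps=5000):
                # Lorenz '63: chaotic, yet fully deterministic (Euler steps).
                for _ in range(steps):
                    dx = 10.0 * (y - x)
                    dy = x * (28.0 - z) - y
                    dz = x * y - (8.0 / 3.0) * z
                    x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
                return (x, y, z)

            a = lorenz(1.0, 1.0, 1.0)
            b = lorenz(1.0, 1.0, 1.0)           # identical initial conditions
            c = lorenz(1.0 + 1e-9, 1.0, 1.0)    # imperfectly known conditions

            print(a == b)  # True: same inputs always give the same outputs
            print(a == c)  # False: tiny uncertainty, wildly different outcome
            ```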

            My position isn’t merely philosophical—it’s the most parsimonious explanation given current scientific understanding of causality, neuroscience, and complex systems. The alternative requires proposing special exemptions for human cognition that aren’t supported by evidence.