James Cameron has reportedly revealed an anti-AI title card will open up Avatar 3, officially titled Avatar: Fire and Ash. The Oscar-winning director shared the news in a Q&A session in New Zealand attended by Twitter user Josh Harding.
Sharing a picture of Cameron at the event, they wrote: “Such an incredible talk. Also, James Cameron revealed that Avatar: Fire and Ash will begin with a title card after the 20th Century and Lightstorm logos that ‘no generative A.I. was used in the making of this movie’.”
Cameron has been vocal in the past about his feelings on artificial intelligence, speaking to CTV News in 2023 about AI-written scripts. “I just don’t personally believe that a disembodied mind that’s just regurgitating what other embodied minds have said – about the life that they’ve had, about love, about lying, about fear, about mortality – and just put it all together into a word salad and then regurgitate it,” he told the publication. “I don’t believe that’s ever going to have something that’s going to move an audience. You have to be human to write that. I don’t know anyone that’s even thinking about having AI write a screenplay.”
James Cameron doesn’t do what James Cameron does for James Cameron. James Cameron does what James Cameron does because James Cameron is James Cameron!
The bravest pioneer
Reminds me of seeing a disclaimer in the credits of animated movies saying no Motion Capture was used
“It’s ok, only the nice computer simulations helped us make this movie, not the bad ones!”
2030’s gonna be interesting
Unfortunately, the same can’t be said for the 4K transfers of Aliens, True Lies, and The Abyss.
But I applaud the efforts nonetheless.
AI upscaling isn’t the same thing as generative AI.
One just upscales the image to a higher resolution by shifting pixels around. The other can create entirely new scenes.
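To illustrate the distinction being drawn here, a minimal sketch of the simplest kind of upscaling (nearest-neighbor): every output pixel is a copy of an existing input pixel, so nothing new is invented. The function name and the toy 2x2 image are illustrative, not from any real upscaler.

```python
def upscale_nearest(image, factor):
    """Upscale a 2D grid of pixel values by an integer factor,
    duplicating each pixel -- no new information is created."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in image
        for _ in range(factor)
    ]

small = [[1, 2],
         [3, 4]]
big = upscale_nearest(small, 2)
# every value in `big` already existed in `small`
```

Real "AI upscalers" go further than this: they hallucinate plausible detail from learned statistics, which is exactly why they can degrade or alter a film's look, as the next comment points out.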
I’m not sure if you’re familiar with the AI upscales mentioned, but AI did a lot more than just “making it a larger resolution”. It fundamentally altered and degraded certain visual aspects of the films.
They used a shitty upscaler and got shitty results. Color me not surprised.
Nerrel mentioned 🌚🌚
That’s great, but don’t forget to make it not suck ass. When a movie sucks ass, it’s not fun to watch it. Like Avatar 2? That sucked ass. We waited longer than the Titanic was underwater for a sequel that was as warm as the water where Titanic rests. That sucks ass
I’m not sure you understand how time works.
You don’t understand facetious comments.
Avatar 2: Bro edition
dedicated to the brave avatar animators
So would his stance change if we move past basic LLMs and have models that can generate coherent, innovative ideas that were not learned?
‘Would his opinion of the technology be different if the technology was different?’
Indeed, is his opinion based on the way the current technology works by regurgitating, or is it based on the loss of creative jobs?
Only when we can accurately point to any one idea that a human has had that hasn’t been a product of previous information.
With historian work, I think it’s possible to say this idea appeared at about this point in time and space, even if it was refined by many previous minds. For example, you can tell about when an engineering invention or an art style appeared. Of course you will always have a specialists debate about who was the actual pioneer (often influenced by patriotism), but I guess we can at least have a consensus of when it starts to actually impact the society.
Also, maybe we can have an algorithm to determine if a generated result was part of the learning corpus or not.

But the idea is never original. The wheel likely wasn’t invented randomly, it started as a rock that rolled down a hill. Fire likely wasn’t started by a caveman with sticks, it was a natural fire that was copied. Expressionism wasn’t a new style of art, it was an evolution that was influenced by previous generations. Nothing is purely original. The genesis of everything is in the existence of something else. When we talk about originality, we mean that these things haven’t been put together this exact way before, and thus, it is new.
I don’t disagree with your definition, but I’m not sure what it changes about the point that current LLMs lack human creativity. Do you think there isn’t anything more than probabilistic regurgitation in human creativity, so LLMs have already overcome human creativity and it’s just a matter of acknowledging it?
I agree that humans are just flesh computers, but I don’t know whether we can say LLMs have overcome human creativity because I think the definition is open to interpretation.
Is the intentionality capable only with metacognition a requirement for something to be art? If no, then we and AI and spiders making webs are all doing the same “creativity” regardless of our abilities to consider ourselves and our actions.
If yes, then is the AI (or the spider) capable of metacognition? I know of no means to answer that except that ChatGPT can be observed engaging in what appears to be metacognition. And that leaves me with the additional question: What is the difference between pretending to think something and actually thinking it?
In terms of specifically “overcoming” creativity, I don’t think that kind of value judgement has any real meaning. How do you determine whether artist A or B is more creative? Is it more errors in reproduction leading to more original compositions?
As I suggested above, I would say creating a coherent idea or link between ideas that was not learned. I guess it could be possible to create an algorithm to estimate if the link was not already present in the learning corpus of an ML model.
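One hedged sketch of what such an estimate could look like: check how many n-grams of a generated text already appear in the training corpus. This is a toy illustration of the idea in the comment, not a real novelty detector; the function names, the n-gram approach, and the example strings are all assumptions.

```python
def ngrams(text, n=3):
    """Return the set of word n-grams in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def novelty_ratio(generated, corpus, n=3):
    """Fraction of the generated text's n-grams absent from the corpus.
    1.0 = every n-gram is unseen; 0.0 = all of them were in the corpus."""
    gen = ngrams(generated, n)
    if not gen:
        return 0.0
    return len(gen - ngrams(corpus, n)) / len(gen)

corpus = "the wheel started as a rock that rolled down a hill"
print(novelty_ratio("a rock that rolled down a hill", corpus))    # 0.0
print(novelty_ratio("a wheel that rolled up a mountain", corpus)) # 1.0
```

A real version would need the actual training corpus and something far more robust than surface n-grams (paraphrases defeat this immediately), which is part of why the question is hard.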
Human creativity, at its core, is not original. We smush things together, package it as something new, and in our hubris call it “original” because we are human, and thus infallible originators. Our minds are just electrical impulses that fire off in response to stimuli. There is no divine spark, that’s hogwash. From a truly scientific standpoint, we are machines built with organic matter. Our ones and zeros are the same as the machines we create, we just can’t deal with the fact that we aren’t as special as we like to think. We derive meaning from our individuality, and to lose that would mean that we aren’t individual. However, we are deterministic.
If you woke up this morning and relived the same day that you already have, and had no prior knowledge of what had happened the previous time you experienced it, and no other changes were made to your environment, you would do the same thing that you did the first time, without fail. If you painted, you would paint the same image. If you ate breakfast, you would eat the same breakfast. How do we know this? Because you’ve already done it. Why does it work this way? Because nothing had changed, and your ones and zeros flipped in the same sequences. There is no “chaos”. There is no “random”. Nothing is original because everything is the way it is because of everything else. When you look at it from that bird’s eye perspective, you see that a human mind making “art” is no different than an LLM, or some form of generative AI. Stimulus is our prompt, and our output is what our machine minds create from that prompt.
Our “black box” may be more obscure and complex than current technology is for AI, but that doesn’t make it different any more than a modern sports car is different than a Model T. Both serve the same function.
From a truly scientific standpoint, we are machines built with organic matter. Our ones and zeros are the same as the machines we create, we just can’t deal with the fact that we aren’t as special as we like to think. We derive meaning from our individuality, and to lose that would mean that we aren’t individual. However, we are deterministic.
Would you have some scientific sources about the claim that we think in binary and that we are deterministic?
I think you may be conflating your philosophical point of view with science.
If it’s not hand-animated on 1s, I care about it as much as I did the last 2.
Why should I even care?
You clearly care enough to leave a comment.