If I can read books and learn, why can’t AI?
Just because you own a cd doesn’t mean you have a license to play it in a club.
Since when can’t you use knowledge gained from books for personal profit?
The only difference is scale.
It’s a good thing they are not playing at a club then.
In this analogy, the AI uses books like a remix DJ would use bits and pieces of songs from different tracks to splice together their output. Except in the case of AI, it will be much harder to identify the original source.
Under this definition, it is illegal to summarize news articles behind a paywall.
If you made money doing that, it probably would be illegal. You would certainly get sued, in any case.
People make a lot of money summarizing articles behind paywalls, and it is generally considered legal as long as it is a summary and not copied text.
Who are you paying for that?
You don’t have to pay for fair use.
Have you never used bits and pieces of what other people say or what you’ve read in books or riffs you’ve heard or styles seen/heard/read when communicating or creating?
Of course. But I’m not a machine churning out an endless spew of those bits and pieces with no further creative input. I’d be on the side of giving any truly conscious entity rights (including creative ones), but LLMs are not conscious, and I don’t think they ever could be. That’s just not how they work, to my understanding anyway.
If LLMs aren’t conscious, who is using them to churn out an endless spew of those bits and pieces with no further creative input?
Someone has to be doing it. I guess it could be these newfangled AI Agents I’ve been hearing about, but as far as I’m aware, at least, they still require input and/or editing (depending on the medium) from a human.
Okay let’s take a break here cuz I think we need to point something out. They are absolutely not conscious. By any definition of the word. By any stretch of the imagination. It’s important to me that you understand this. What you are describing here is a tool. Not something with consciousness.
I completely agree. Reread what I wrote with that in mind, keeping in mind the context of the comment I replied to.
…because you are a person, not a product.
LLMs (currently colloquially “AI”) are literally incapable of “learning.”
They are likely referring to the training process, in which model weights are populated from prepared datasets by training algorithms.
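To make that concrete, here is a minimal, hypothetical sketch of what "populating model weights" means: the weights start as random noise and end up reflecting the dataset only because an optimization loop repeatedly adjusts them against prepared examples. This is a toy single-weight model for illustration, not code from any real LLM stack.

```python
import random

# Prepared dataset: (input, target) pairs the weights will be fit to.
# The targets follow y = 2*x + 1, so the loop should recover w close to 2, b close to 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

w, b = random.random(), random.random()   # weights start as arbitrary noise
lr = 0.01                                 # learning-rate hyperparameter

for step in range(5000):
    x, y = random.choice(data)
    pred = w * x + b        # forward pass: apply the current weights
    err = pred - y          # gradient of squared error w.r.t. the prediction
    w -= lr * err * x       # nudge the weights toward the data
    b -= lr * err

print(f"populated weights: w={w:.2f}, b={b:.2f}")
```

Once that loop finishes, the weights are fixed; a deployed model simply applies whatever values the training run produced.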
LLMs are not sentient and never can be.
You bought it, you own it.
But you can’t make copies of it and sell them.