I think that is less a problem with the technology itself than with how it is currently used or created. I wouldn't say that anything generated with AI is stolen work, as that presupposes that AI necessarily involves stealing.
I vaguely remember Adobe Firefly being trained only on properly licensed images, to the point that Adobe is willing to accept legal responsibility for its output (though some AI-generated work did make it into their stock image site, which muddies the ethics, even if it will in all likelihood be legally impossible to pin down). Sadly, this is Adobe: everything is behind closed doors, you have to pay them a pretty significant sum, and you can't really mess with the internals.
So for now there is a choice between ethics, openness, and capability (pick at most two). Which, frankly, is a terrible state to be in.