This doesn’t work outside of laboratory conditions.
It’s the equivalent of “doctors find cure for cancer (in mice).”
I like that example. Every time you hear about some discovery that X kills 100% of cancer cells in a petri dish, you always have to think: so does bleach.
Nice, maybe we should try injecting bleach. I heard it also cures Covid!
You ever heard of Miracle Mineral Solution? It’s bleach with extra steps and some of the 5G loonies give their autistic kids enemas with it to drive out the “toxins” giving their kids autism.
It hasn’t worked much outside of the laboratory, because they just released it from the laboratory. They’ve already proven it works in their paper with about 90% effectiveness.
Yeah I wouldn’t take this number at face value, let’s wait for some real world usage
It’s clever really. People who don’t like AI are very likely to also not understand the technology, so if you’re going to grift, it’s a perfect set of rubes - tell them your magic code will defeat the evil magic code of the AI and that’s all they need to know. Fudge some numbers and they’ll throw their money at you.
It’s clever really. The people who hate protecting your art from usage you don’t approve of are very likely to not understand the technology. If you’re going to mock them, they’re the perfect set of rubes.
What’s not clever is making stuff up to not really make a point after typing a whole paragraph lmao
It’s not FOSS and I don’t see a way to review if what they claim is actually true.
It may just end up helping to differentiate legitimate human-made work from machine-generated work, thus helping AI training models.
Can’t demonstrate that either, because its license expressly forbids adapting the software to other uses.
Edit, alter, modify, adapt, translate or otherwise change the whole or any part of the Software nor permit the whole or any part of the Software to be combined with or become incorporated in any other software, nor decompile, disassemble or reverse engineer the Software or attempt to do any such things
The EULA also prohibits using Nightshade “for any commercial purpose”, so arguably if you make money from your art—in any way—you’re not allowed to use Nightshade to “poison” it.
This is the part most people will ignore, but I get that it’s mainly meant for big actors.
I read the article enough to find that the Nightshade tool is under EULA… :(
Because it definitely is not FOSS, use it with caution, preferably on a system not connected to the internet.
Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme.
And as I said there, it is utterly hypocritical for him to sell snake oil to artists, allegedly to help them fight copyright violations, while committing actual copyright violations.
Pardon my ignorance but how do you steal code if it’s open source?
He took GPLv3 code, which is under a copyleft license that requires you to share your source code and license your project under the same terms as the code you used. You also can’t distribute your project as binary-only or proprietary software. When pressed, they only released the code for their front end, remaining in violation of the GPLv3.
You don’t follow the license that it was distributed under.
Commonly: you use open source code in your project, and that code is under a license requiring your project to also be open source, but you keep yours closed source.
I still wouldn’t call it stealing. I guess “violated an open source license” doesn’t have the same impact, but I’d prefer accuracy.
It’s piracy: distributing copyrighted works against the terms of their license. I agree stealing is not really the right word.
Ah, another arms race has begun. Just be wary, what one person creates another will circumvent.
The tool’s creators are seeking to make it so that AI model developers must pay artists for uncorrupted training data.
That’s not something a technical solution will work for. We need copyright laws to be updated.
You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group who recently won a historic case: border guards now need a warrant to search your phone.
Yeah, that’s what I’m saying - our current copyright laws are insufficient to deal with AI art generation.
They aren’t insufficient, they are working just fine. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. We shouldn’t be trying to make that any worse.
Yep. Copyright should not include “viewing or analyzing the picture” rights. Artists want to start charging you, or software, to even look at art they literally put out for free. If you don’t want your art seen by a person or an AI, then don’t publish it.
Copyright should absolutely include analyzing when you’re talking about AI, and for one simple reason: companies are profiting off of the work of artists without compensating them. People want the rewards of work without having to do the work. AI has the potential to be incredibly useful for artists and non artists alike, but these kinds of people are ruining it for everybody.
What artists are asking for is ethical sourcing for AI datasets. We’re talking paying a licensing fee or using free art that’s opt-in. Right now, artists have no choice in the matter - their rights to their works are being violated by corporations. Already the music industry has made it illegal to use songs in AI without the artist’s permission. You can’t just take songs and make your own synthesizer out of them, then sell it. If you want music for something you’re making, you either pay a licensing fee of some kind (like paying for a service) or use free-use songs. That’s what artists want.
When an artist, who does art for a living, posts something online, it’s an ad for their skills. People want to use AI to take the artist out of the equation. And doing so will result in creativity only being possible for people wealthy enough to pay for it. Much of the art you see online, and almost all the art you see in a museum, was paid for by somebody. Van Gogh died a poor man because people didn’t want to buy his art. The Sistine Chapel was commissioned by a Pope. You take the artist out of the equation and what’s left? Just AI art made as a derivative of AI art that was made as a derivative of other art.
MidJourney is already storing pre-rendered images made from and mimicking around 4,000 artists’ work. The derivative works infringement is already happening right out in the open.
Begun, the AI Wars have.
I didn’t have that on my 2020 bingo card, but it has been a very long year so everything is possible.
Excited to see the guys that made Nightshade get sued in a Silicon Valley district court, because they’re something something mumble mumble intellectual property national security.
They already stole GPLv2 code for their last data poisoning scheme and remain in violation of that license. They’re just grifters.
Won’t this thing actually help the AI models in the long run? The biggest issue I’ve heard is the possibility of AI generated images getting into the training dataset, but “poisoned” artworks are basically guaranteed to be of human origin.
Unless you intentionally poison AI-generated images and add them to circulation, which is not hard to do, nor a great leap of logic, if you hate AI.
Better poison everything, then
As an artist, nightshade is not something I will ever use. All my art is public domain, including AI. Let people generate as many pigeon pictures as they want I say!
That’s great for you, truly it is, but for others it’s not.
Mind explaining what artists it isn’t good for? I genuinely don’t see why it is so hard to let others remix and remake.
Believe it or not I need to eat food. Crazy I know.
Oh hey nice! So do I!
Do you have a means of securely and reliably getting it? Cause I don’t.
You really come across as coming from a place of privilege while implying that the reason poor people are worried about this is that they’re just not as nice as you.
Lmao I have never been rich, in my entire life. It isn’t like my art is being directly copied.
Well, if you can’t beat them, join them. You have to adjust to the pace of what society is moving towards.
What if it’s adjusting towards segregation and fascism? Should we go for that too?
Seriously? This literally has nothing to do with segregation and fascism.
doesn’t work anyway lol
Ironic that they used an AI picture for the article…
I hope every artist starts using it.
AI art isn’t real art.
Your opinion, not a fact. Most art is as derivative as AI art, or more so.
How can any human-made art be more derivative than AI art? That’s impossible.
AI doesn’t create anything - it’s not even real AI yet, it’s just an automated data-scraper. When you tell it to “make” something, it just pulls up bits and pieces that match that description and forms them into a Frankenstein’s monster of what you asked it to make.
No, that’s fact. AI-generated images aren’t art. They’re hallucinations without meaning or purpose.
You need to learn the difference between opinion and fact, then.
What if it’s 50/50 text-to-image and manual brush work?
Then it’s multimedia art
Just because some creator (or, in your words, hallucinator) did not intend meaning does not prohibit or somehow prevent any beholder from deriving or instilling meaning. Your weird comment is also art. The webpage or app we are viewing it on is art. Remember this? The definition you use may suit you personally, but words are for communicating with others, and to most others its definition will be crucially different from yours. Consider adjusting your view or the words you use.
Stupid opinion. If I ask AI to draw an image, that has no meaning or purpose? But if I did the exact same thing with a pencil, then it’s suddenly art? AI is just a tool, and people like you need to get over it - or fully commit and say anything digital isn’t art because a computer really did it. Anything made in Photoshop can’t be art according to you, because a program made it. Blender renders aren’t art because a computer generated them. All you did in either case was tell it what to do.
Do you reply to people this way in person?
If they’re stupid yes.
ok, well I guess good luck in this world
You too. Maybe stop presenting your opinions as facts and maybe you’ll get a better reply next time.
Is anyone else excited to see poisoned AI artwork? This might be the element that makes it weird enough.
Also, re: the guy lol’ing at someone saying this is illegal - it might be. Is it wrong? Absolutely not. Does the woefully broad Computer Fraud and Abuse Act contain language that this might violate? It depends; the CFAA has two requirements for something to be in violation of it:
- the act in question affects a government computer, a financial institution’s computer, OR a computer “which is used in or affecting interstate or foreign commerce or communication” (that last one is the biggie, because it means that almost 100% of internet activity falls under its auspices)
- the act “knowingly causes the transmission of a program, information, code, or command, and as a result of such conduct, intentionally causes damage without authorization, to a protected computer;” (with “protected computer” being defined in the first point)
Quotes are from the law directly, as quoted at https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act
The poisoned artwork is information created with the intent of causing it to be transmitted to computers across state or international borders and damaging those computers. Using this technique to protect what’s yours might be a felony in the US, and because it would be considered intentionally damaging a protected computer by the knowing transmission of information designed to cause damage, you could face up to 10 years in prison for it. Which is fun, because the people stealing from you face absolutely no retribution at all for their theft - they don’t even have to give you some of the money they use your art to make - but if you try to stop them, you go to prison for a decade.
The CFAA is the same law that Reddit co-founder Aaron Swartz was prosecuted under. His crime was downloading things from JSTOR that he had a right to download as an account holder, but more quickly than they felt he should have. He was charged with 13 felonies and faced 50 years in prison, over a million dollars in fines, and a lifetime ban from ever using an internet-connected computer when he died by suicide. The charges were then dropped.
It’s not damaging a computer, it’s poisoning the models AI uses to create the images. The program will work just fine, and as expected given the model that it has; the difference is the model might not be accurate. It’s like saying you’re breaking a screen because it’s now showing a low-res version of an image.
The models are worth money and are damaged. That’s how the law will see it.
My big thing here is: if there’s no contract, where is the onus for having correct models? Yeah, the models are worth money, but is it the artist or the software maker who’s responsible for those models being correct? I’d say most people who understand how software works would say the software maker, unless they were corporate shills. Make better software, or pay the artists; the reaction shouldn’t be “artists are fooling me, they should pay”.
Taking it to an extreme: say somehow they had this same software back in the 90s. Could the generative software company sue because all the images were in 256 colors? From your perspective, yes, because it was messing up their models, which were built for many more colors.
“Damage to a computer” is legal logorrhoea, possible interpretations range from not even crashing a program to STUXNET, completely under-defined so it’s up to the courts to give it meaning. I’m not at all acquainted with US precedent but I very much doubt they’ll put the boundary at the very extreme of the space of interpretation, which “causes a program to expose a bug in itself without further affecting functioning in any way” indeed is.
Which is fun because the people stealing from you face absolutely no retribution at all for their theft,
Learning from an image, studying it, is absolutely not theft. Otherwise I shall sue you for reading this comment of mine.
“Damage to a computer” is legal logorrhoea
The model is the thing of value that is damaged.
Learning from an image is not theft
But making works derivative from someone else’s copyrighted image is a violation of their rights.
So any art done in a style of another artist is theft? Of course not. Learning from looking at others is what all of us do. It’s far more complicated than you’re making it sound.
IMO, If the derivative that the model makes is too close to someone else’s, the person distributing such work would be at fault. Not the model itself.
But again, it’s very nuanced. It’ll be interesting to see how it plays out in the courts.
Of course not, but what does this have to do with generative models? Deep learning has as much to do with learning as the Democratic People’s Republic of Korea does with democracy.
The model is the thing of value that is damaged.
It does not get damaged, it stays as it is. Also it’s a bunch of floats, not a computer.
But making works derivative from someone else’s copyrighted image is a violation of their rights.
“Derivative work” doesn’t mean “inspired by”. For a work to be derivative it needs to include major copyrightable elements of the original work(s). Things such as style aren’t even copyrightable. Character design is, but then you should wonder whether you actually want to enforce that in non-commercial settings like fanart, even commissioned fanart, if e.g. Marvel doesn’t care as long as you’re not making movies or actual comics. They gain nothing from there not being, say, a Deadpool version of the Drake meme.
And this is why we don’t obey the law.
I like the idea, but Nightshade and Glaze have some pretty high-end graphics requirements. Sadly, I have an Nvidia GTX 1660, which apparently has issues with PyTorch. 😢
Ok, but who is going to rein in military & law enforcement AI tech?
Is there a similar tool that will “poison” my personal tracked data? Like, I know I’m going to be tracked and have a profile built on me by nearly everywhere online. Is there a tool that I can use to muddy that profile so it doesn’t know if I’m a trans Brazilian pet store owner, a Nigerian bowling alley systems engineer, or a Beverly Hills sanitation worker who moonlights as a practice subject for budding proctologists?
The only way to taint your behavioral data so that you don’t get lumped into a targetable cohort is to behave like a maniac. As I’ve said in a past comment here, when you fill out forms, pretend your gender, race, and age are fluid. Also, pretend you’re nomadic. Then behave erratically as fuck when shopping online - pay for bibles, butt plugs, taxidermy, and PETA donations.
Your data will be absolute trash. You’ll also be miserable because you’re going to be visiting the Amazon drop off center with gag balls and porcelain Jesus figurines to return every week.
Then behave erratically as fuck when shopping online - pay for bibles, butt plugs, taxidermy, and PETA donations.
…in the same transaction. It all needs to be bought and then shipped together. Not only to fuck with the algorithm, but also to fuck with the delivery guy. Because we usually know what you ordered. Especially when it’s in the soft bag packaging. Might as well make everyone outside your personal circle think you’re a bit psychologically disturbed, just to be safe.
The browser addon “AdNauseum” can help with that, although it’s not a complete solution.
Mbyae try siunlhffg the mldide lterets of ervey wrod? I wnedor waht taht deos to a luaangge medol?
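For what it’s worth, that word-mangling trick is trivial to automate. A minimal Python sketch (function names are mine, purely illustrative): keep each word’s first and last letters in place and shuffle the middle, which keeps text mostly human-readable while garbling the exact token sequence.

```python
import random

def scramble_word(word: str) -> str:
    """Shuffle the middle letters of a word, keeping the first and last in place."""
    if len(word) <= 3:
        return word  # too short to have a shuffleable middle
    middle = list(word[1:-1])
    random.shuffle(middle)
    return word[0] + "".join(middle) + word[-1]

def scramble_text(text: str) -> str:
    """Apply the middle-letter shuffle to every whitespace-separated word."""
    return " ".join(scramble_word(w) for w in text.split())

print(scramble_text("Maybe try shuffling the middle letters of every word"))
```

Whether this actually bothers a language model is another question - tokenizers are surprisingly robust to this kind of noise - but it does make the text trivially recoverable by humans and statistically odd to a machine.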
Is there a similar tool that will “poison” my personal tracked data? Like, I know I’m going to be tracked and have a profile built on me by nearly everywhere online. Is there a tool that I can use to muddy that profile so it doesn’t know if I’m a trans Brazilian pet store owner, a Nigerian bowling alley systems engineer, or a Beverly Hills sanitation worker who moonlights as a practice subject for budding proctologists?
Have you considered just being utterly incoherent, and not making sense as a person? That could work.