RichardAragon/NightshadeAntidote: An 'antidote' to the recently released AI poison pill project known as Nightshade.
Submitted 9 months ago by Even_Adder@lemmy.dbzer0.com to stable_diffusion@lemmy.dbzer0.com
https://github.com/RichardAragon/NightshadeAntidote
Comments
stepanzak@iusearchlinux.fyi 9 months ago
This is gonna be an interesting fight…
tyler@programming.dev 9 months ago
Why would someone make something like this? Geez, people really love building unethical stuff, don’t they?
elliot_crane@lemmy.world 9 months ago
The tagline is really poorly written IMO. From reading the README, this doesn’t outwardly appear to be a tool for bypassing an artist’s choice to use something like Nightshade, but rather it seems to detect if such a tool has been used.
I’m assuming that the use case would be to avoid training on Nightshade-ed images, which would actually be respecting the original artist’s decision?
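To make that concrete, here’s a rough sketch of how a detector like this could slot in before training. To be clear, the `looks_poisoned` heuristic below is something I made up (a crude high-frequency-energy check with an arbitrary threshold), not what NightshadeAntidote actually implements; it just shows where a detection step would sit in the pipeline:

```python
# Hypothetical sketch: filter out flagged images BEFORE they enter a
# training set. `looks_poisoned` is a toy heuristic, NOT the repo's
# actual analysis.
from pathlib import Path

import numpy as np
from PIL import Image


def looks_poisoned(path: Path, threshold: float = 0.35) -> bool:
    """Toy heuristic: flag images whose spectral energy is unusually
    concentrated outside the low-frequency center (threshold is arbitrary)."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    y, x = np.ogrid[:h, :w]
    # Everything farther than min(h, w) / 4 from the center counts as
    # "high frequency" for this toy check.
    outer = (y - cy) ** 2 + (x - cx) ** 2 > (min(h, w) // 4) ** 2
    return spectrum[outer].sum() / spectrum.sum() > threshold


def clean_dataset(image_dir: Path) -> list[Path]:
    """Return only the images the detector does not flag, i.e. leave
    suspected Nightshade images out of the training set entirely."""
    return [p for p in sorted(image_dir.glob("*.png")) if not looks_poisoned(p)]
```

The point is just that a check like this runs before an image ever enters the dataset, which is what makes it read as respecting the opt-out rather than defeating it.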
tyler@programming.dev 9 months ago
I read the whole thing. I understand it’s for detecting use of Nightshade, not bypassing it. What other even slightly ethical use is there for this, besides making sure you don’t train on a poisoned image? The people training these models are clearly not asking for permission first, or else you’d never need this tool. They’re just taking an image, assuming they’re allowed to use it, and then using this tool to check whether it will poison their model.
Even_Adder@lemmy.dbzer0.com 9 months ago
You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group that recently won a historic case: border guards now need a warrant to search your phone.
More importantly, Nightshade is anti-open source. Since the only models with open VAEs are Stable Diffusion’s open models, closed-source models from companies like Midjourney and OpenAI, which you can’t poke around in, can’t be attacked with this tool. That’s not really something that should be celebrated.
Nightshade was also made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme. He took GPLv3 code; the GPLv3 is a copyleft license that requires you to share your source code and license your project under the same terms as the code you used, and you can’t distribute your project as binary-only or proprietary software. When pressed, his team only released the code for their front end, remaining in violation of the terms of the GPLv3 license.
tyler@programming.dev 9 months ago
And I have to say, it’s pretty telling that you saw my comment and took “unethical” for “illegal”. Your focus is clearly “this isn’t illegal, and here’s the evidence to support it”, rather than introspecting and seeing that legality isn’t tied to ethics in a lot of cases. Try looking at it from an ethics standpoint instead: you’ll find there’s a lot less to stand on in support of how these models are created. Of course, getting every artist’s permission to use their images in a model would be incredibly difficult, so you support the “it’s not illegal” route instead, even though it’s clearly unethical.
tyler@programming.dev 9 months ago
I never once mentioned legality. I mentioned ethics. Clearly you’re talking about one while I mean the other. It doesn’t really matter whether you’re technically within the confines of the law here; this tool is clearly meant to bypass authors’ intent so that image data can be stolen, no matter the source. If an author posted a clear notice stating that you cannot use their images in a model, there would be no need for this tool, because you wouldn’t use those images in the first place. But since these image models are built on massive datasets obtained by scraping, without anyone bothering to ask permission, people now build tools to make sure that can continue.
This is unethical. It does not matter what the law says: you are ignoring whatever rights an author has asserted over an image, using the law to sidestep the ethics, and training on those ill-gotten images to build something that will eventually replace the author.
And bringing up the creator of Nightshade here once again does not matter; this is a discussion about the ethics of the tool you posted, not about the legality of someone else’s actions.