Comment on The activist who’s taking on artificial intelligence in the courts: ‘This is the fight of our lives’
Even_Adder@lemmy.dbzer0.com 9 months ago
Their problem was that they smashed too many looms and not enough capitalists. AI training isn’t just for big corporations. We shouldn’t applaud people who put up barriers that will make it prohibitively expensive for regular people to keep up. This will only help the rich and give corporations control over a public technology.
frog@beehaw.org 9 months ago
It should be prohibitively expensive for anyone to steal from regular people, whether it’s big companies or other regular people. I’m no more enthusiastic about people stealing from artists to create open source AIs than I am when corporations do it. For an open source AI to be worth the name, it would have to use only open source training data - i.e., material that is in the public domain or has specifically had an open source licence assigned to it. If the creator hasn’t said they’re okay with their content being used for AI training, then it’s not valid for use in an open source AI.
Even_Adder@lemmy.dbzer0.com 9 months ago
I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF, if you haven’t already. The EFF is a digital rights group that most recently won a historic case: border guards now need a warrant to search your phone.
People are trying to conjure up new rights to take away another piece of the public’s right to access and use information, fashioning themselves as a new owner class. Artists and everyone else should accept that others have the same rights they do, and they can’t take those opportunities away from other people now that it’s their turn.
There’s already a model trained on just Creative Commons licensed data, but you don’t see them promoting it. That’s because it was never about the data - it’s about an attack on their status. When presented with generators that didn’t use their art, they came out overwhelmingly against them, with the same condescending and reductive takes they’ve been using this whole time.
I believe that generative art, warts and all, is a vital new form of art that is shaking things up, challenging preconceptions, and getting people angry - just like art should.
frog@beehaw.org 9 months ago
I’m actually fine with generative AI that uses only public domain and Creative Commons content. I’m not threatened by AI as a creative, because AI can only iterate on its own training data. Only humans can create something genuinely new and original. My objection is solely on the basis of theft. If we agree that everybody has the basic right to control their own data and content, then that logically has to extend to artists: they must have the right to control their own work, and consenting to humans viewing it isn’t the same as consenting to having it fed into an AI.
I suspect there would be a lot more artists open to considering the benefits of a generative AI using only public domain and creative commons works if they weren’t justifiably aggrieved at having their life’s work strip-mined. Expecting the victims of exploitation to be 100% rational about their exploiter (or other adjacent parties trying to argue why it’s fine when they do it) isn’t reasonable. At this point, artists simply don’t trust the generative AI industry, and there needs to be a significant and concerted effort to rectify existing wrongs to repair that trust. One organisation offering a model based on creative commons artworks, when the rest of the generative AI industry is still stealing everything that’s not nailed down, does not promote trust. Regulate, compensate, mend some fences, and build trust. Then go and talk to artists, and have the conversations that should have been had before the first AI models were built. The AI industry needs to prove it can be trusted, and then learn to ask for permission. Then, maybe, it can ask for forgiveness.
Even_Adder@lemmy.dbzer0.com 9 months ago
I don’t like this kind of thought because it tries to minimize the role of the person at the controls. There is no reason why a person using a model trained on 1400s art, African art, anime, photography, cubism, sculpture, culinary art, impressionism, nature, ancient Greco-Roman art, etc. wouldn’t be able to come up with novel concepts, executions, and styles, since it’s very much the combination of styles that gives rise to new types of art in all other mediums. And that’s before you even start fine-tuning on your own stuff.
It isn’t like a human viewing it, but it is very much like other protected uses of data. To quote the article:
This is just a way to analyze and reverse engineer concepts in images so you can make your own original works. Reverse engineering has been fair use since Sega Enterprises Ltd. v. Accolade, Inc. in 1992, and was later affirmed in Sony Computer Entertainment, Inc. v. Connectix Corporation.
In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. There are just some things you can’t stop people from doing with things you’ve shared with them, and we shouldn’t be trying to change that.
Calling this stealing is self-serving, manipulative rhetoric that unjustly vilifies people and misrepresents the reality of how these models work and how creative the people who use them can be.