Agreed. The problem is that so many (including in this thread) argue that training AI models is no different from training humans, that a human brain inspired by what it sees is functionally the same thing.
My response to why there is still an ethical difference rests on two arguments: scale and profession.
Scale: AI models’ sheer image output makes them a threat to artists in a way that other human artists are not. One artist clearly profiting off another’s style can still be inspiration, and even part of the former’s path toward their own style; however, the functional equivalent of ten thousand artists doing the same is something else entirely. The art is produced at a scale that could drown out the original artist’s work, the very work without which such image generation wouldn’t be possible in the first place.
Profession: Those profiting from AI art, which relies on unpaid scraping of artists’ work for data sets, are not themselves artists. They are programmers, engineers, and the CEOs and stakeholders who can even afford the ridiculous capital necessary to utilize this technology at scale in the first place. The idea that this is just a “continuation of the chain of inspiration from which all artists benefit” is nonsense.
As the popular adage goes nowadays, “AI models allow wealth to access skill while forbidding skill to access wealth.”
desktop_user@lemmy.blahaj.zone 2 months ago
I think that, in many ways, AI is just worsening the problems of excessive copyright terms. Copyright should last 20 years, maybe 40 if it can be proven that the work is actively in use.
EldritchFeminity@lemmy.blahaj.zone 2 months ago
Copyright is its own whole can of worms that could have entire essays just about how it and AI cause problems. But the issue at hand really comes down to one simple question:
Is a man not entitled to the sweat of his brow?
“No!” says society. “It’s not worth anything.”
“No!” says the prompter. “It belongs to the people.”
“No!” says the corporation. “It belongs to me.”
LainTrain@lemmy.dbzer0.com 2 months ago
Does it not belong to the people? That saying is a shitty analogy for this. You’re entitled to the sweat of your brow, but not to more from society, and if you use the free infrastructure of the commons to share your work, it belongs to the commons.
EldritchFeminity@lemmy.blahaj.zone 2 months ago
And what free infrastructure would that be? Social media is privately run, as are websites. Art posted online largely falls under the category of advertising, as artists are advertising their services for commission purposes.
AI bros say that image generators have democratized art. Do you know what actually democratized art? The pencil. The chisel and slate. Taking the effort of other people and using it for your own convenience without giving them proper credit isn’t democracy or fair use. It’s corporate middle management. People simply don’t want to put in the effort to learn a valuable skill, and they don’t want to pay for it either, but they still want the reward for said effort. It’s like expecting your friend to fix your computer for free because they work in IT.
ClamDrinker@lemmy.world 2 months ago
I think you are making the mistake of assuming that disagreement with your stance means someone would answer “no” to these questions. Simply put, it’s a strawman.
Most (yes, even corporations, albeit much less so the larger ones) would say “Yes” to this question on its face value, because they would want the same for their own “sweat of the brow”. But certain uses after the work is created no longer have a definitive “Yes” as their answer, which is why your ‘simple question’ is not an accurate representation: it draws no distinction between those uses. You cannot stop your publicly posted work from being analyzed, by human or computer. This is firmly established. As others have put it in this thread, reducing protections around analysis would be detrimental to artists and everyone else alike. It would slow society’s ability to advance, if not halt it completely, since most research requires analysis of existing data.
Artists have always been undervalued, I will give you that. But to fix that, we should give artists better protections that don’t rely on breaking down other freedoms. For example, UBI. And I wish people who are against AI would focus on that, since it’s something you could actually get most of society to agree on.
EldritchFeminity@lemmy.blahaj.zone 2 months ago
It’s not about “analysis” but about for-profit use. Public domain still falls under Fair Use. I think you’re being too optimistic about support for UBI, but I absolutely agree on that point. There are countries that believe UBI will be necessary within a decade’s time, as more and more of the population becomes permanently unemployed by jobs being replaced. I myself say that I don’t think anybody would really care if their livelihoods weren’t at stake (aside from dealing with the people who look down on artists and say that writing prompts makes them just as good as, if not better than, artists). As it stands, artists are already forming their own walled-off communities to isolate their work from being publicly available, and creating software to poison image models. So either art becomes largely inaccessible to the public, or some form of horrible copyright action is taken, because those are the only options available to artists.
Ultimately, I’d like a licensing system put in place, similar to open source software licenses, where people can license their works and companies have to cite the sources of their training data. Academics have to cite their sources in research, and holding for-profit companies to the same standard seems like a step in the right direction. Simply require your data scraper to keep track of where it got its data from, in a publicly available list. That way, if they’ve used work they legally shouldn’t have, it can be proven.
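To make the proposal concrete, here is a minimal sketch of what such a provenance log could look like. All names here (`SourceRecord`, `ProvenanceLog`, the example URL and license) are hypothetical illustrations, not any real scraper's API; the point is just that recording attribution at scrape time is technically trivial.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class SourceRecord:
    """One scraped work: where it came from and under what terms."""
    url: str         # where the work was scraped from
    creator: str     # attribution, if known
    license: str     # license declared at time of scraping
    scraped_at: str  # ISO timestamp, for auditability

class ProvenanceLog:
    """Accumulates source records and exports the public citation list."""

    def __init__(self):
        self.records = []

    def record(self, url, creator="unknown", license="unspecified"):
        # Called once per scraped item, alongside the download itself.
        self.records.append(SourceRecord(
            url=url,
            creator=creator,
            license=license,
            scraped_at=datetime.now(timezone.utc).isoformat(),
        ))

    def export(self):
        # The "publicly available list" the comment describes,
        # serialized as JSON for publication.
        return json.dumps([asdict(r) for r in self.records], indent=2)

# Hypothetical usage: log a work as it is scraped, then publish the list.
log = ProvenanceLog()
log.record("https://example.com/art/123",
           creator="some_artist", license="CC-BY-4.0")
print(log.export())
```

An auditor (or the artist) could then search the published list for their URLs, which is exactly the proof mechanism the comment asks for.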