Comment on AI trained on photos from kids’ entire childhood without their consent
Even_Adder@lemmy.dbzer0.com 5 months ago
So the question that comes to mind is exactly how, on a practical level, it would work to make sure that when a company scrapes data, trains an AI, and then makes billions of dollars, the thousands or millions of people who created the data all get a cut after the fact. Because particularly in the creative sector, a lot of people are freelancers who don’t have a specific employer they can go after. From a purely practical perspective, paying artists before the data is used makes sure all those freelancers get paid. Waiting until the company makes a profit, taxing it out of them, and then distributing it to artists doesn’t seem practical to me.
This isn’t labor law.
frog@beehaw.org 5 months ago
Labour law alone, governing the terms under which people are employed and how they are paid, does not protect freelancers from the scenario that you, and so many others, advocate for: a multitude of individuals all training their own AIs. No AI advocate has ever proposed a viable and practical solution for the large number of artists who aren’t directly employed by a company but are still exposed to all the downsides of unregulated AI.
Even_Adder@lemmy.dbzer0.com 5 months ago
I don’t think they have to; the point is to fight against the regression of public rights for the benefit of the few.
frog@beehaw.org 5 months ago
Destroying the rights of artists for the benefit of AI owners doesn’t achieve that goal. Outside of the extremely wealthy who can produce art for art’s sake, art is a form of skilled labour that is a livelihood for a great many people, particularly the forms of art that are most at risk from AI - graphic design, illustration, concept art, etc. Most of the people in these roles are freelancers who aren’t in salaried jobs that can be regulated with labour laws. They are typically commissioned to produce specific pieces of art. I really don’t think AI enthusiasts have any idea how rare stable, long-term jobs in art actually are. The vast majority of artists are freelancers: it’s essentially a gig economy.
Changes to labour laws protect artists who are employees - which we absolutely should do, so that companies can’t simply employ artists, train AI on their work, then fire them all. But that doesn’t protect freelancers from companies that say “we’ll buy a few pieces from that artist, then train an AI on their work so we never have to commission them again”. It is incredibly complex to redefine commissions as waged employment in such a way that the company can use the work for AI training while the artist is guaranteed future employment.
And then there’s the issue of the companies that say “we’ll just download their portfolio, then train an AI on the portfolio so we never have to pay them anything”. All of the AI companies in existence fall into this category at present - they are making billions on the backs of labour they have never paid for, and have no intention of ever paying for. There seems to be no rush to say that they were actually employing those millions of artists, who are now owed back-pay for years’ worth of labour and all the other rights that workers protected by labour laws should have.
Even_Adder@lemmy.dbzer0.com 5 months ago
I’m not fighting for the extremely wealthy; I’m fighting for the existence of competitive open source models, something that can’t happen under what you’ve proposed. That would just hand corporations a monopoly on a public technology by making it prohibitively expensive for regular people to keep up with megacorporations that already own vast troves of data and can afford to buy even more.
This article by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries, does a good job of explaining what I’m talking about.