“These were mostly family photos uploaded to personal and parenting blogs […] as well as stills from YouTube videos”
So… people posted photos of their kids on public websites, Common Crawl scraped them, LAION-5B filtered the scrape into a training set, and now there are models. This doesn’t seem evil to me… the digital commons working as intended.
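For anyone curious what “cleaned it up for training” actually means: LAION built its dataset by pairing Common Crawl image URLs with their alt text and keeping only pairs whose CLIP image-text similarity cleared a threshold (roughly 0.28 for the English subset). A minimal sketch of that filtering step, with a stand-in similarity function instead of the real CLIP model:

```python
# Sketch of LAION-style filtering: keep (image URL, alt text) pairs
# only when an image-text similarity score clears a cutoff.
# clip_similarity below is a hypothetical stand-in, NOT the real CLIP model.

THRESHOLD = 0.28  # roughly LAION's published cutoff for English pairs

def clip_similarity(image_url: str, alt_text: str) -> float:
    """Stand-in scorer. A real pipeline would embed the downloaded image
    and the alt text with CLIP and take their cosine similarity."""
    return 0.9 if alt_text else 0.0  # fake score for illustration only

def filter_pairs(pairs):
    """Keep only pairs whose similarity meets the threshold."""
    return [(url, alt) for url, alt in pairs
            if clip_similarity(url, alt) >= THRESHOLD]

sample = [
    ("https://example.com/a.jpg", "a family photo at the beach"),
    ("https://example.com/b.jpg", ""),  # no alt text: scores 0, dropped
]
print(filter_pairs(sample))
```

Point being: there’s no step here that checks consent or who’s in the photo — only whether the caption matches the image.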
If anyone is surprised, the fault lies with the UX around “private URL” sharing, not with devs using Common Crawl.
Fapper_McFapper@lemm.ee 5 months ago
criitz@reddthat.com 5 months ago
Too bad, it’s here forever…
remotelove@lemmy.ca 5 months ago
It’s been around for a while. It’s the fluff and the parlor tricks that need to die. AI has never been magic, and it’s still a long way from actually being intelligent.
DdCno1@beehaw.org 5 months ago
It could be regulated into oblivion, to the point that any commercial use of it (and even non-commercial publication of AI-generated material) becomes a massive legal liability, despite the fact that AI tools like Stable Diffusion cannot be taken away. It’s not entirely unlikely that some countries will try this in the future, especially places with strong privacy and IP laws as well as equally strong laws protecting workers. Germany and France come to mind, which together could push the EU to come down hard on large AI services in particular. This could make the recently adopted EU AI Act look harmless by comparison.
Catsrules@lemmy.ml 5 months ago
AI will remember that.