Comment on Meta pirated and seeded porn for years to train AI, lawsuit says

WalnutLum@lemmy.ml ⁨1⁩ ⁨week⁩ ago

Well, Stable Diffusion 3 reportedly removed all porn from its training data on purpose and negatively trained the model on it, and that apparently destroyed the model’s ability to generate correct anatomy.

Regardless, image generation models need at least some porn in training so they know what porn is, and therefore what it is not.

It’s part of a process called regularization: preventing a model from over-fitting to its training data.
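As a rough sketch of what regularization means in the textbook sense (an L2 penalty that discourages a model from fitting its training data too tightly), here is a toy example in plain Python; the numbers and function names are illustrative, not how diffusion models are actually trained:

```python
# Toy illustration of L2 regularization: fit y = w*x by gradient descent,
# once without and once with a penalty term l2 * w^2 added to the loss.
def fit(xs, ys, l2=0.0, lr=0.01, steps=1000):
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # gradient of mean squared error, plus gradient of the L2 penalty
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n + 2 * l2 * w
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.1, 5.9]          # roughly y = 2x

w_plain = fit(xs, ys)          # fits the data closely, w near 2.0
w_reg = fit(xs, ys, l2=1.0)    # penalty pulls w toward 0, trading fit for simplicity
```

The regularized weight ends up smaller than the unregularized one; in large models the same idea (penalizing or diversifying what the model can latch onto) keeps it from memorizing a narrow slice of the data.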

source