It's certainly technically possible. I suspect these AI models just aren't good at it, so pedophiles need to train them on actual images.
I can imagine, for example, that the AI doesn't know what puberty looks like, since it hasn't in fact seen many naked children. It would try to infer from all the internet porn it has seen and draw any female with big breasts, disregarding age. And that's not how children actually look.
I haven't tried it, since it's illegal where I live, but that's my suspicion as to why pedophiles bother training their own models.
rikudou@lemmings.world 1 day ago
How do they know that? Did the pedos text them to let them know? Sounds very made up.
ExtremeDullard@lemmy.sdf.org 1 day ago
The article says “remixed” images of old victims have cropped up.
rikudou@lemmings.world 1 day ago
And again, what's the source? The great thing about articles on CSAM is that you don't need sources: everyone just assumes you have them but obviously can't share them.
Did at least one pedo try that? Most likely yes. Is it the best way to get good-quality fake CSAM? Not at all.
ExtremeDullard@lemmy.sdf.org 1 day ago
I don't know, man. But I assume associations concerned with child abuse are all over that shit and checking it out. I'm not a specialist in CSAM, but I assume an article that says old victims show up in previously-unseen images doesn't lie, because why would it? It's not like Wired is a pedo outlet…
Also, it was just a question. I’m not trying to convince you of anything 🙂