Free_Opinions@feddit.uk 1 day ago
First of all, it’s by definition not CSAM if it’s AI-generated. It’s simulated CSAM - no person was harmed in making it; that harm happened when the training data was created.
However, it’s not necessary that such content even exists in the training data. Just as ChatGPT can generate sentences it has never seen before, image generators can also generate pictures they have never seen before. Of course the results will be more accurate if that’s what they’ve been trained on, but it’s not strictly necessary. It just takes a skilled person to write the prompt.
My understanding is that the simulated CSAM content you’re talking about has been made by people running the software locally and providing the training data themselves.
it’s by definition not CSAM if it’s AI generated
Tell that to the judge. People caught with machine-made imagery go to the slammer just as much as those caught with the real McCoy.
Have there been cases like that already?
It’s not legal advice I’m giving here.
Buffalox@lemmy.world 1 day ago
This is blatantly false. It’s also illegal, and you can go to prison for owning, selling, or making child Lolita dolls.
frightful_hobgoblin@lemmy.ml 1 day ago
Dumb internet argument from here on down; advise the reader to do something else with their time.
Free_Opinions@feddit.uk 1 day ago
What’s blatantly false about what I said?
Buffalox@lemmy.world 1 day ago
CSAM = Child sexual abuse material
Even virtual material is still legally considered CSAM in most places. Although no children were hurt, it’s still a depiction of child sexual abuse, and that’s enough.
Free_Opinions@feddit.uk 1 day ago
Being legally considered CSAM and actually being CSAM are two different things. I stand behind what I said, which wasn’t legal advice. By definition it’s not abuse material, because nobody has been abused.