What’s blatantly false about what I said?
Comment on If AI spits out stuff it's been trained on
Buffalox@lemmy.world 1 day ago
First of all, it’s by definition not CSAM if it’s AI generated. It’s simulated CSAM.
Free_Opinions@feddit.uk 1 day ago
This is blatantly false. It’s also illegal, and you can go to prison for owning, selling, or making child Lolita dolls.
Buffalox@lemmy.world 1 day ago
CSAM = Child sexual abuse material
Even virtual material is still legally considered CSAM in most places. Although no children were hurt, it’s a depiction of it. And that’s enough.
Free_Opinions@feddit.uk 1 day ago
Being legally considered CSAM and actually being CSAM are two different things. I stand behind what I said, which wasn’t legal advice. By definition it’s not abuse material, because nobody has been abused.
Buffalox@lemmy.world 1 day ago
There’s a reason it’s legally considered CSAM: as I explained, it is material that depicts it.
You can’t make up your own definitions.
frightful_hobgoblin@lemmy.ml 1 day ago
Dumb internet argument from here on down; advise the reader to do something else with their time.