Comment on “If AI spits out stuff it’s been trained on”
Buffalox@lemmy.world 1 day ago
CSAM = child sexual abuse material
Even virtual material is still legally considered CSAM in most jurisdictions. Although no children were harmed, it’s a depiction of abuse, and that’s enough.
Free_Opinions@feddit.uk 1 day ago
Being legally considered CSAM and actually being CSAM are two different things. I stand behind what I said, which wasn’t legal advice. By definition it’s not abuse material, because nobody has been abused.
Buffalox@lemmy.world 1 day ago
There’s a reason it’s legally considered CSAM: as I explained, it’s material that depicts it.
You can’t make up your own definitions.
Free_Opinions@feddit.uk 1 day ago
I already told you that I’m not speaking from a legal point of view. CSAM means a specific thing, and AI-generated content doesn’t fit under that definition. The only way to produce CSAM is by abusing children and taking pictures/videos of it. AI content doesn’t count any more than stick-figure drawings do. The justice system may not differentiate the two, but that’s not what I’m talking about.
Buffalox@lemmy.world 1 day ago
Society has decided otherwise; as I wrote, you can’t have your own facts. You might as well claim that in traffic, red means go, because you have your own interpretation of how traffic lights should work.
frightful_hobgoblin@lemmy.ml 1 day ago
Which law are you speaking about?