Comment on If AI spits out stuff it's been trained on

Buffalox@lemmy.world 1 day ago

CSAM = Child sexual abuse material
Even virtual material is still legally considered CSAM in most places. Although no children were hurt in making it, it's still a depiction of abuse, and that's enough.
