Comment on lemm.ee plans for mitigating image upload abuse
rarely@sh.itjust.works 1 year ago
This is an interesting idea. So if I’m understanding you correctly, the workflow would be like this:
- user uploads 4 images… 2 are flagged as CSAM.
- user overrides the flag on one image, asserting that “no, this isn’t CSAM”
- on other sites, I’ve seen this work by the content remaining hidden from everyone except the uploader until a team reviews it. If the team agrees, it’s allowed on the site. I think this is different from what you’re describing, though: you’re suggesting the content stays online after the user overrides the flag, and a mod only later double-checks whether the user was indeed trustworthy (rough sketch of that flow after this list).
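Something like this, sketched in Python (all names, states, and the queue are mine for illustration, not anything from the OP):

```python
from dataclasses import dataclass
from enum import Enum, auto

class ImageStatus(Enum):
    VISIBLE = auto()         # passed the filter, or override upheld by a mod
    FLAGGED_HIDDEN = auto()  # filter flagged it; hidden pending user action
    OVERRIDDEN = auto()      # user said "not CSAM"; visible again, queued for review
    REMOVED = auto()         # mod confirmed the flag

@dataclass
class Image:
    id: int
    status: ImageStatus = ImageStatus.VISIBLE

review_queue: list[Image] = []  # images awaiting a mod's double-check

def on_upload(image: Image, filter_flagged: bool) -> None:
    image.status = ImageStatus.FLAGGED_HIDDEN if filter_flagged else ImageStatus.VISIBLE

def on_user_override(image: Image) -> None:
    image.status = ImageStatus.OVERRIDDEN  # content is online again immediately...
    review_queue.append(image)             # ...a mod only double-checks later

def on_mod_review(image: Image, mod_agrees_with_filter: bool) -> None:
    image.status = ImageStatus.REMOVED if mod_agrees_with_filter else ImageStatus.VISIBLE
```

The key difference from the hidden-until-reviewed model is that `on_user_override` makes the image visible before any human has looked at it.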
My only worry is that an untrustworthy user could keep the content online until a mod reviews it, increasing the time the material stays up and therefore the risk. It would be difficult to argue “this was done in the interest of user satisfaction, even though it meant more CSAM got out”. Ultimately, I don’t know how many people would want to make that argument to a judge.
lily33@lemm.ee 1 year ago
From the OP, it seems the filters don’t flag CSAM specifically; they flag any NSFW content. That said, keep in mind that the filter will also have false negatives, so if people want to slip NSFW through, some of it can end up on the site even without such an option.
But I don’t mind the content staying hidden until a mod reviews it in such cases. The false positive rate of the filter would likely be small, so there wouldn’t be too many things needing review.
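To put rough numbers on that (every rate and volume below is a made-up assumption, just to show the shape of the tradeoff):

```python
# Back-of-the-envelope review load; all numbers are illustrative assumptions.
uploads_per_day = 10_000
nsfw_fraction   = 0.02   # assumed share of uploads that are actually NSFW
false_positive  = 0.01   # clean images wrongly flagged
true_positive   = 0.90   # NSFW images correctly flagged (so 10% false negatives)

flagged_clean = uploads_per_day * (1 - nsfw_fraction) * false_positive
flagged_nsfw  = uploads_per_day * nsfw_fraction * true_positive
slipped_past  = uploads_per_day * nsfw_fraction * (1 - true_positive)

print(f"items hidden pending review per day: {flagged_clean + flagged_nsfw:.0f}")
print(f"of which false positives (annoyed users): {flagged_clean:.0f}")
print(f"NSFW that slips past the filter anyway: {slipped_past:.0f}")
```

With these made-up numbers, the mod queue is around 280 items a day, fewer than 100 of them false positives, while about 20 NSFW images slip past the filter no matter what the override policy is.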