Instagram users have told the BBC of the “extreme stress” of having their accounts banned after being wrongly accused by the platform of breaching its rules on child sexual exploitation.
The BBC has been in touch with three people who were told by parent company Meta that their accounts were being permanently disabled, only to have them reinstated shortly after their cases were highlighted to journalists.
I’m so glad I stopped using that garbage.
SweetCitrusBuzz@beehaw.org 3 weeks ago
Whilst this is terrible, it's a good reminder that companies which use automated bots, algorithms, and underpaid, exploited workers for moderation will never be good for social media platforms.
Commercial and social media should never go together in the same sentence if you want a platform that is good and truly serves the people who use it.
Randomgal@lemmy.ca 3 weeks ago
So how would you moderate? Assuming you're paying fair wages.
SweetCitrusBuzz@beehaw.org 3 weeks ago
I've been a moderator on many, many things, and this is what I've learned:
Firstly, make sure the team has tonnes of support from the get-go, both from each other and from mental health services etc., as a bare minimum.
Secondly, understand if people need to take breaks, as the work can take a real toll.
Thirdly, nobody on the mod team should talk down to or criticise others' work, especially not in a group or public setting. Ask them how they're doing, ask what support or information they need, and then give it to them.
Fourthly, explain what is important and why it's important in a way each person can understand, and have detailed documents explaining this as well.
Fifthly, give the users of the service easy-to-understand, non-legalese rules.
Sixthly, if someone violates the rules, talk to them personally; if the content is harmful, delete it; and if it's really harmful (such as CSAM), remove them and don't allow them back, no matter what (see the sketch after this list).
That's not an exhaustive list, just what I've learned from being a moderator on many things myself and from watching others moderate.
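For anyone who thinks in code, here is a minimal sketch of the tiered escalation in point six. The `Severity` levels, the `enforce` function, and the action strings are all hypothetical names for illustration, not any platform's actual policy, and real moderation involves human judgement at every step.

```python
from enum import Enum

class Severity(Enum):
    MINOR = 1    # ordinary rule violation
    HARMFUL = 2  # harmful content
    SEVERE = 3   # e.g. CSAM

def enforce(severity: Severity) -> list[str]:
    """Map a violation's severity to a tiered response (illustrative only)."""
    if severity is Severity.MINOR:
        # Step one: a personal conversation, nothing more.
        return ["talk to the user personally"]
    if severity is Severity.HARMFUL:
        # Harmful content comes down, but the user still gets a conversation.
        return ["delete the content", "talk to the user personally"]
    # SEVERE: permanent removal, never allowed back.
    return ["delete the content", "permanently ban the user"]

for level in Severity:
    print(level.name, "->", enforce(level))
```

The point of modelling it this way is that the response is fixed by severity, not by who the moderator happens to be, which keeps enforcement consistent across the team.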