Kids have been doing idiotic shit to themselves since the dawn of time. TikTok or YouTube didn’t cause this.
It’s not about who caused it, it’s about responsibility. The responsibility for making it easy to spread, for amplifying the message. Kids in your class are very different from millions of viewers. Even in grade school there’s a chance an adult might see it and stop it from happening, or educate the children.
Ultimately this is an issue of public health and of education. For such a huge company, a $10m fine is practically nothing, especially when they could train their own algorithm not to surface content like this. Or they could have moderation which removes potentially harmful content. Why are you going to bat for a huge company to avoid responsibility for content which caused real harm?
ColeSloth@discuss.tchncs.de 2 weeks ago
Right. And how are you supposed to train an algorithm to filter out any stupid thing a kid might try that’s dangerous? The possibilities are endless. Maybe the parents shouldn’t let their 13-year-olds have unrestricted phones and access to TikTok.
LukeZaz@beehaw.org 2 weeks ago
Why are you singling out one small part of their comment to the exclusion of the rest?
ColeSloth@discuss.tchncs.de 2 weeks ago
I didn’t want to type out paragraphs’ worth of talking to a brick wall.
It’s not the internet’s job to safeguard your kids. That’s the bottom line. All of this regulation and moderation is just a stepping stone toward a controlled and moderated internet. Y’all just want to slowly add more and more limitations and training wheels to life, and you’re giving up our own freedoms and rights to do it.
Tell me, who decides where the line is drawn between allowable and not allowed? How are millions of hours of content supposed to be moderated by decency police to make that decision? How well do you think something automated that tried to do it could work?
The fine isn’t the point. Yeah, ten million is nothing to a large company. But what it really does is create censorship “for the children.”