TikTok has to face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated For You Page (FYP). The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”
TikTok’s algorithmic recommendations on the FYP constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.
t3rmit3@beehaw.org 3 months ago
I am generally very skeptical of lawsuits making social media and other Internet companies liable for their users’ content, because that’s usually a route to censor whatever the government deems “harmful”, but I think this case actually makes perfect sense by attacking the algorithmic “curation” that they do. Imo social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.
chahk@beehaw.org 3 months ago
But then how would they make money if they can’t keep users doomscrolling forever to keep serving them ads? Won’t someone think of the shareholders?!
technocrit@lemmy.dbzer0.com 3 months ago
Unfortunately nobody can stop me from doomscrolling.
Kolanaki@yiffit.net 3 months ago
As if that would at all stop these dumbass challenges from being posted and copied? People have been hurting themselves copying something they saw someone else doing even before the invention of the camera.
t3rmit3@beehaw.org 3 months ago
Yes, but that is not the entirety, or even the majority, of the problem with algorithmic feed curation by corporations. Reducing the visibility of those dumb challenges is just one of many benefits.
schnurrito@discuss.tchncs.de 3 months ago
No it wouldn’t, but people would only see them if they were part of a preexisting community where such things are posted or they specifically looked for them.
On the Internet, censorship happens through information overload: there is far more content than our limited time and attention can absorb, so going after recommendation algorithms will work.