We’ve all seen our fair share of viral videos, but some of these social media trends can be downright dangerous, even when they seem harmless at first glance. Look at planking, a meme that produced an endless stream of funny photos of people lying face-down in odd places – it was even a joke on The Office. It was all laughs until a 20-year-old died trying to plank on a balcony. And today’s social media challenges can be even more dangerous, such as the blackout challenge, which encourages people to choke themselves until they pass out. Social media can make things like this look like a game, and the CDC attributes 82 deaths to this “choking game” among children and teens.
The blackout challenge isn’t new, but a recent lawsuit means that social media companies could be held accountable for serving harmful content like this — particularly to children. TikTok is currently in the hot seat for recommending blackout challenge videos to Nylah Anderson, a 10-year-old who mimicked what she saw online and died in the hospital five days later. It’s a horrific incident, but it’s not an isolated one: at least seven parents have sued TikTok over deaths related to these videos.
It’s very difficult to hold social media companies accountable for the content they host due to Section 230, a law that says companies aren’t liable for what users post on their platforms. This has protected Big Tech from liability even for extremely dangerous content, such as the Islamic State recruiting for terrorist attacks over Facebook or human trafficking perpetrated over Craigslist. Tech companies have made some moves to protect users of their platforms, but those measures are often weak at best. For example, TikTok says it banned searches for #BlackoutChallenge to keep its platform safe, but that didn’t prevent Nylah from seeing videos about it.
Read more: Worried About Your Teen’s TikTok Use? Here’s What You Can Do
But it’s possible that Big Tech’s liability shield is cracking, and Nylah’s case is on the front lines of the fight. That’s because she didn’t search for these videos: TikTok recommended them to her. Her lawyers argued that this recommendation, driven by TikTok’s algorithms that predict what users want to watch, is speech on behalf of the platform and not just content the platform is hosting. A federal appeals court has just ruled that TikTok is not protected by Section 230, and legal proceedings will continue. This could finally make the social media company – and others – own up to the harm they can cause by hosting dangerous content.
TikTok isn’t the only one facing trouble. Late last year, a judge refused to dismiss a lawsuit against Meta, TikTok, Snap and Google that accused the companies of intentionally designing products to addict teenagers and fuel mental health disorders. The judge ruled that Section 230 didn’t apply because the issue wasn’t the content the companies were hosting, but how the platforms were designed and what they recommend to users. Reddit and YouTube are also in legal trouble they can’t sidestep via Section 230, as the companies fight a lawsuit alleging that their “defective product” recommended radical content to the perpetrator of the 2022 Buffalo supermarket shooting and then helped him prepare for the attack.
Read more: Making Snapchat Safer for Your Teen: A Guide to Parental Controls
These are signs that Section 230, which has shielded Big Tech for nearly thirty years, may not be the get-out-of-jail-free card it once was – particularly as these companies increasingly use AI and algorithms to recommend or even write content. Even the companies themselves are recognizing that they’re on shaky legal ground, and some have started to support increased regulation of social media, particularly when it comes to protecting kids online.
But the results of legislation may still be a long time coming. KOSA, the Kids Online Safety Act, passed the Senate with bipartisan support this summer, but it hasn’t been taken up by the House since. And even though the bill aims to protect kids by requiring online platforms to “prevent and mitigate” content that could harm children, critics worry it could be misused to censor content.
Lawsuits may force Big Tech to take action sooner. Similar lawsuits against the tobacco and opioid industries resulted in huge payouts and corporate bankruptcies. We may be seeing the very start of a similar reckoning as the consequences of harmful online content finally catch up with Big Tech – and there’s hope it could drive change across the industry to keep kids safe online.
[Image credit: child holding a phone with the TikTok logo via sergei_elagin/BigStockPhoto]
Elizabeth Harper is a writer and editor with more than a decade of experience covering consumer technology and entertainment. In addition to writing for Techlicious, she's Editorial Director of Blizzard Watch and is published on sites all over the web, including Time, CBS, Engadget, The Daily Dot and DealNews.