As lawsuits continue piling up against social media platforms for allegedly harming children, a Pennsylvania court has ruled that TikTok is not liable in one case where a 10-year-old named Nylah Anderson died after attempting to complete a “Blackout Challenge” she discovered on her “For You” page.
The challenge encourages users to choke themselves until they pass out, and Nylah’s mother, Tawainna Anderson, initially claimed that TikTok’s defective algorithm was responsible for knowingly feeding the deadly video to her child. Anderson argued that Section 230 of the Communications Decency Act, which grants social platforms immunity for content published by third parties, should not apply in the case, but the judge ultimately found that TikTok was immune.
TikTok’s “algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it,” Judge Paul Diamond wrote in a memorandum before issuing his order. “In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.”
This is not the only lawsuit seeking to hold TikTok liable for children’s deaths linked to the “Blackout Challenge.” Other lawsuits filed this summer in California are still pending, and they make similar arguments about TikTok’s allegedly defective algorithm. Diamond suggested that Nylah’s mother “cannot defeat Section 230 immunity” simply “by creatively labeling her claims.” His ruling suggests that those pending lawsuits won’t fare any better at overcoming the shield Section 230 grants social media companies as publishers, no matter how their algorithms are designed to recommend content.