Friday, March 20, 2026

TikTok Blackout Challenge: Child Deaths, Lawsuits, and Algorithm Danger


A nine-year-old girl is dead. So are dozens of other children. And a social media platform’s algorithm may have helped put the challenge in front of them.

The so-called blackout challenge — a dangerous dare that instructs participants to choke themselves until they lose consciousness — has claimed the lives of at least 20 children in recent years, with lawsuits now targeting TikTok directly over allegations that its recommendation algorithm actively pushed the deadly trend to minors. The cases span two continents and multiple years, and they’re forcing courts, parents, and policymakers to confront an uncomfortable question: when does a platform’s design become complicity?

A Family Torn Apart in Stephenville

JackLynn Blackwell was nine years old and growing up in Stephenville, Texas, when she died in her own backyard after wrapping a cord around her neck. Her father, Curtis Blackwell, has struggled to describe a life that no longer includes her. Recalling the tight-knit bond they shared, he said simply, “It was just the three of us, three amigos, we did everything together.” She became one of roughly 80 cases documented by the Centers for Disease Control and Prevention, a tally that includes choking-related child deaths recorded as far back as 1995.

She wasn’t alone. In Aurora, Colorado, Joshua Haileyesus, just 12 years old, died after choking himself with a shoelace. His death became one of the central cases in a wave of litigation against TikTok. “TikTok is just not a safe place for kids,” said an attorney involved in a campaign pushing for stronger parental controls on the platform. It’s a blunt assessment. But given the body count, it’s hard to argue with.

Across the Atlantic, the Same Grief

This isn’t only an American story. Four British families have filed suit in U.S. courts, alleging TikTok’s platform was directly responsible for the 2022 deaths of their children. Isaac Kenevan, 13. Archie Battersbee, 12. Julian “Jools” Sweeney, 14. Maia Walsh, 13. All gone. All, their families contend, funneled toward the same lethal content by the same recommendation engine.

The suits don’t just allege negligence. They allege something more specific and, frankly, more damning — that TikTok’s algorithm promoted the blackout challenge to children under 13, a demographic the platform technically prohibits from using the app at all. That’s the catch, isn’t it? The age restrictions exist on paper. The algorithm, apparently, didn’t get the memo.

The Legal Battle and Its Roadblocks

Courts have not been uniformly receptive. Several earlier lawsuits were dismissed on immunity grounds under Section 230 of the Communications Decency Act, the federal statute that has long shielded platforms from liability for third-party content. It’s a legal wall that plaintiffs’ attorneys have been trying to chip away at for years — arguing that algorithmic amplification isn’t passive hosting, it’s active promotion, and that distinction should matter.

Still, the legal fight continues. In January 2026, a Delaware court held a hearing on TikTok’s motion to dismiss the suit brought by the British families, including Ellen Roome and Lisa Kenevan, both of whom lost children in 2022. The outcome of that motion could set a significant precedent — one that either cracks open platform liability or, once again, slams the door on grieving parents seeking accountability from one of the world’s most powerful tech companies.

How Did It Get This Far?

How does a challenge this obviously lethal keep circulating? The mechanics aren’t complicated. The blackout challenge involves cutting off oxygen to the brain until the participant passes out — a stunt that can cause permanent neurological damage or death within minutes, often without any warning signs. It predates TikTok by decades; the CDC’s documented cases stretch back to 1995. But the platform’s reach and its algorithmic appetite for engagement accelerated its spread in ways that earlier generations of the internet simply couldn’t.

Women’s Health documented the resurgence of the trend in 2021, noting at least 20 recent child deaths tied to the challenge. That’s not a blip. That’s a pattern. And patterns, in journalism and in law, tend to demand explanations.

What Comes Next

TikTok has maintained that it removes dangerous content and has invested in safety tools. The company has pointed to age verification efforts and content moderation as evidence of good faith. Critics — and plaintiffs’ attorneys — say those measures are cosmetic at best, and that a platform generating billions of dollars in revenue can and should do more. A hearing in Delaware early this year suggests the courts aren’t entirely done asking those questions either.

For Curtis Blackwell and the other parents who’ve buried their children, the legal arguments are almost beside the point. They want someone to be held responsible. They want the algorithm that served a death dare to a nine-year-old to be treated as something more than a neutral technical process. Whether the law will agree with them remains, for now, an open question — but it’s one the courts are no longer able to avoid.

Three amigos became two. And somewhere in a server farm, an algorithm is still learning what to show a child next.
