TikTok sued by 14 states over alleged harm to children’s mental health
Clare Duffy, CNN | 10/8/2024, 10:53 a.m.
A bipartisan group of 14 attorneys general from across the country filed lawsuits on Tuesday against TikTok, alleging that the platform has “addicted” young people and harmed their mental health.
The lawsuits take issue with various elements of the TikTok platform, including its endlessly scrolling feed of content, TikTok “challenge” videos that sometimes encourage users to engage in risky behavior and late-night push notifications that the attorneys general claim can disrupt kids’ sleep.
The lawsuits were filed separately by members of the coalition, co-led by New York Attorney General Letitia James and California Attorney General Rob Bonta. They mark just the latest legal pressure facing TikTok, which is also battling a law that could see it banned in the United States as soon as next year, a lawsuit from the US Justice Department alleging the platform unlawfully collected children’s data, and several state actions.
In June, New York’s governor also signed into law a bill to regulate social media algorithms; for example, it will require platforms to display content in chronological order to users under the age of 18, which could force TikTok to overhaul how it operates. And last month, 42 state attorneys general called on US Surgeon General Vivek Murthy to require labels on social media apps warning of their potential harm to young users.
“We strongly disagree with these claims, many of which we believe to be inaccurate and misleading,” TikTok spokesperson Alex Haurek said in a statement. “We’re proud of and remain deeply committed to the work we’ve done to protect teens and we will continue to update and improve our product. We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features.”
TikTok, for its part, has repeatedly said it believes its platform is safe for children and that it offers safety features such as default screentime limits for young users and optional parental oversight tools.
Haurek added that TikTok has “endeavored to work with the Attorneys General for over two years, and it is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industrywide challenges.”
However, the group of states involved in Tuesday’s action claims TikTok is not doing enough.
“TikTok’s underlying business model focuses on maximizing young users’ time on the platform so the company can boost revenue from selling targeted ads,” the attorneys general said in a statement. “TikTok uses an addictive, content-recommendation system designed to keep minors on the platform as long as possible and as often as possible, despite the dangers of compulsive use.”
The complaint filed by James alleges that TikTok “knows that compulsive use of and other harmful effects of its platform are wreaking havoc on the mental health of millions of American children and teenagers.” It also states that, “TikTok considers users under the age of 13 to be a critical demographic,” despite the company saying it allows only users 13 and older on the platform. The complaint references internal TikTok documents, though the filing is heavily redacted.
James alleges that the platform’s focus on “profits over safety has made TikTok extremely profitable,” noting that TikTok’s 2023 US revenue reached $16 billion, according to the complaint. The complaint also cites a Harvard study that claimed TikTok earned $2 billion in ad revenue in 2022 from US teens aged 13 to 17.
TikTok’s so-called beauty filters – which manipulate users’ images, often by making them appear thinner or as if they are wearing makeup – can “encourage unhealthy, negative social comparison, body image issues, and related mental and physical health disorders” by creating “an impossible standard” for teens, the complaint alleges.
It also alleges that TikTok “challenges,” viral trends where users try to replicate videos created by others, can encourage dangerous behavior among young users. Earlier this year, a teen boy died in Brooklyn while riding on the outside of a subway train, a stunt known as “subway surfing,” and his mother later “found videos promoting subway surfing in a challenge on his TikTok account,” the complaint states. TikTok previously cooperated with New York authorities to remove subway surfing content, the New York Times reported in January.
James’ complaint also accuses TikTok of violating the US Children’s Online Privacy Protection Act (known as “COPPA”) by failing to prevent children under the age of 13 from joining the app and collecting their personal information without parental consent. It states that TikTok claims the platform is not for children under the age of 13, but that it “features child-directed subject matter, characters, activities, music, and other content as well as advertisements directed to children.”
“By maximizing the TikTok platform’s addictive properties, TikTok has cultivated a generation of young users who spend hours per day on its platform—more than they would otherwise choose to—which is highly detrimental to teens’ development and ability to attend to personal needs and responsibilities,” James’ complaint states.
The lawsuit seeks financial penalties against TikTok, including a requirement that the platform repay any profits it received from ads directed to New York teens or pre-teens.
CNN’s Matt Egan contributed to this report.