New Hampshire's Platform Design Suit Against TikTok

On June 25, 2024, New Hampshire Attorney General John M. Formella filed a 124-page complaint against TikTok alleging violations of the state’s Consumer Protection Act (CPA) and other laws. The State’s suit is an effort to address the addictive design features employed by social media platforms. The complaint alleges predatory and deceptive practices that extract and sell user data and encourage children to spend more time on the app, ultimately leading to exploitation and destructive behavior. The complaint frames TikTok’s platform as manipulative and characterizes children’s relationship with the app as a public health crisis. The State asserts that “[t]he compulsive and prolonged use of TikTok increases the chances of experiencing poor mental health outcomes, including symptoms of depression and anxiety and experiencing lower life satisfaction.”

The complaint outlines TikTok’s business model and key design elements of concern. Like other social media platforms, TikTok’s algorithm leverages user data to personalize feeds and to make content recommendations. TikTok also uses an “infinite scroll” feature and attention-grabbing push notifications that hold young users captive. TikTok has established a virtual currency, “gamifying” its platform design and exploiting children by allowing them to fall prey to “giftbaiting,” a practice where users solicit TikTok coins and gifts during livestreams. 

On July 8, Judge John C. Kissinger, Jr. largely denied TikTok’s motion to dismiss, a boon for the State. In a well-crafted order, Judge Kissinger waded through TikTok’s defenses, a few of which are outlined below.

Jurisdiction

First, the court rejected TikTok’s argument that New Hampshire lacked personal jurisdiction over the company. The order explained that TikTok’s harvesting of personal data from New Hampshire users, and its sale of that data, established the specific, state-directed contacts required for personal jurisdiction to attach. The order distinguished these contacts from situations where a platform’s engagement with users is broader and not targeted at a particular state.

Section 230 Immunity

TikTok raised Section 230 of the Communications Decency Act, the law that provides social media companies with immunity from claims arising from the substance of third-party content posted on their platforms, as an affirmative defense. The court considered whether Section 230 provides immunity to a social media company against claims that negligent product design features encourage dangerous behavior. It concluded that it does not, reasoning that TikTok’s “duty to design a reasonably safe product is independent from its role as a publisher of third-party content.”

First Amendment Protections

TikTok also sought First Amendment protection, which, like Section 230, safeguards publishers’ discretion to disseminate third-party speech and provides a categorical shield from liability for injuries that arise from that speech. Similar to its findings on Section 230, the court held that the First Amendment does not bar the State’s claims, given that “the thrust of the State’s claims seeks to hold [TikTok] accountable for the harm caused by the alleged addictive design features… regardless of the substance or organization of the third-party content disseminated.”

Federal Preemption 

The court’s analysis of the alleged violations of the State’s CPA and the Federal Trade Commission Act underscores the “rascality” and coerciveness allegedly baked into TikTok’s platform design: there is no doubt that users voluntarily download and sign up to use the app, but once they are on it, “they are coerced to continue using the app by the addictive design features that prey on minor users’ psychological vulnerabilities.”

What’s Next?

This order is a big win for New Hampshire and a bright spot for other State Attorneys General looking to hold social media companies accountable for their platform design decisions. 
