What to know about Minnesota's lawsuit alleging TikTok preys on young people with addictive algorithms


Minnesota Sues TikTok Over Alleged Exploitation of Young Users

Minnesota has filed a lawsuit against TikTok and its parent company, ByteDance, accusing the social media platform of deploying addictive algorithms that harm young users’ mental health. The complaint, filed by Attorney General Keith Ellison, alleges that TikTok’s design intentionally exploits psychological vulnerabilities to keep minors engaged for excessive periods, in violation of state consumer protection laws.

Key Allegations in the Lawsuit

The lawsuit claims TikTok’s algorithm-driven features, such as infinite scrolling and personalized content recommendations, are engineered to create a “cycle of addiction” among young users. Minnesota asserts that the platform:

  • Fails to disclose the addictive nature of its platform to users and parents.
  • Prioritizes profit over safety by maximizing screen time, despite knowing the risks of anxiety, depression, and sleep disorders in minors.
  • Collects data on users under 13 without adequate parental consent, potentially violating the federal Children’s Online Privacy Protection Act (COPPA).

Legal and Regulatory Context

Minnesota’s lawsuit builds on growing scrutiny of social media’s impact on youth mental health. The state cites its Deceptive Trade Practices Act, arguing TikTok’s business practices mislead consumers about safety and data privacy. Attorney General Ellison emphasized that the suit seeks to force TikTok to “end its manipulative tactics” and pay civil penalties for alleged violations.

This case follows similar actions nationwide. In 2023, Utah and Arkansas filed suits against TikTok, while Montana attempted to ban the app outright—a move blocked by a federal judge. At the federal level, lawmakers have proposed bills like the Kids Online Safety Act (KOSA) to limit algorithmic targeting of minors.

TikTok’s Response

TikTok has denied the allegations, stating it invests in safeguards for younger users, including screen-time limits and parental controls. A spokesperson said the company “will vigorously defend” its practices, emphasizing its commitment to “supporting the well-being of our community.”

Implications and Challenges

The lawsuit raises pointed questions about how far states can go in holding tech companies accountable for product design. Minnesota seeks remedies including civil fines, algorithmic transparency, and stricter age verification. Legal experts, however, note potential hurdles, including arguments that federal law preempts state-level regulation of digital platforms. First Amendment concerns may also arise, as courts have historically treated algorithmic curation as protected speech.

Broader Impact on Tech Regulation

Minnesota’s case highlights the escalating pressure on social media firms to address youth safety. A successful outcome could inspire more states to pursue aggressive litigation or legislation, accelerating calls for federal oversight. Conversely, a dismissal might embolden platforms to resist changes to their business models.

As the case unfolds, it underscores the national debate over balancing innovation with protections for vulnerable users—a challenge that will shape the future of digital governance.
