Minnesota Sues TikTok Over Allegations of Exploiting Young Users
The state of Minnesota has filed a lawsuit against TikTok, accusing the social media platform of deploying addictive algorithms that harm young users’ mental health. The complaint, led by Minnesota Attorney General Keith Ellison, alleges that TikTok’s design intentionally preys on minors by keeping them engaged for excessive periods, violating state consumer protection laws.
Key Allegations in the Lawsuit
The lawsuit, filed in Hennepin County District Court, centers on claims that TikTok’s features—such as infinite scroll, personalized content recommendations, and frequent notifications—are engineered to create compulsive usage patterns. Minnesota argues these tactics disproportionately affect children and teens, contributing to rising rates of anxiety, depression, and social media addiction.
- Addictive Design: The suit claims TikTok’s algorithm prioritizes user retention over safety, using “dopamine-manipulating” tactics to keep young people hooked.
- Data Collection: Minnesota alleges TikTok collects sensitive data on users under 13 without proper parental consent, potentially violating the federal Children’s Online Privacy Protection Act (COPPA).
- Misleading Safety Claims: The state argues TikTok downplays its platform’s risks while promoting itself as safe for minors.
Legal Basis and Potential Penalties
The case invokes Minnesota’s Deceptive Trade Practices Act, which prohibits businesses from engaging in deceptive or misleading practices. If the suit succeeds, TikTok could face fines, operational restrictions in the state, and mandates to redesign its algorithms and data practices. The lawsuit also seeks restitution for Minnesota families affected by the platform’s alleged harms.
TikTok’s Response
In a public statement, TikTok denied the allegations, stating, “We prioritize the safety and well-being of our community, particularly teens, through features like screen-time limits and parental controls.” The company emphasized its efforts to comply with COPPA and collaborate with experts to address mental health concerns.
Broader Implications
Minnesota’s lawsuit adds to growing legal challenges facing social media companies. Multiple states and the federal government have scrutinized platforms such as Meta (Facebook/Instagram) and Snapchat over similar concerns. The case reflects wider debates over regulating algorithmic transparency and holding tech firms accountable for user harm.
Legal experts note the outcome could influence nationwide policies, including proposed federal laws like the Kids Online Safety Act (KOSA), which seeks to mandate stricter safeguards for minors on digital platforms.
What’s Next?
The court will likely evaluate whether TikTok’s design practices meet the legal threshold for consumer harm. Minnesota’s case may also prompt other states to pursue comparable action, amplifying pressure on social media companies to reform their engagement-driven business models.
For now, the lawsuit underscores the escalating clash between regulators advocating for child safety and tech platforms defending their operational autonomy.
