A federal judge has signaled that claims over TikTok’s handling of children’s data can move forward. The dispute centers on whether the platform collected and used personal information from under-13 users without meeting stricter children’s privacy requirements. The ruling keeps key allegations alive and pushes the case toward discovery rather than ending it at the pleadings stage.
The latest TikTok privacy case at a glance
- The core issue: Collection and use of personal information from children under 13 without the kind of verified parental consent and safeguards that youth privacy rules demand.
- What moved the court: Plaintiffs plausibly alleged that under-13 users could access full features, that the app gathered various identifiers and profile details, and that controls meant for kids were not consistently applied.
- What this means now: The lawsuit will proceed. The parties head into fact-finding, where data flows, product settings, and internal practices typically come under a microscope.
What’s being alleged
The complaint describes a product experience where children could create or maintain standard accounts and where the service collected names, contact details, device and location signals, and behavioral information. The suit also challenges whether age-gating, consent flows, and parental involvement were adequate and consistently enforced.
Why the court didn’t shut it down early
At this stage, a judge asks a narrow question: do the allegations, if true, state a viable claim? The answer here was yes. That doesn’t mean the plaintiffs have proven their case; it means they’ve alleged enough to warrant investigation. The next phase will test what actually happened, how data moved through systems, and how safeguards were designed and implemented.
Why this matters beyond one platform
- Youth privacy remains a top enforcement priority. Regulators and courts continue to scrutinize how mainstream apps manage children’s profiles, signals, and features that can be attractive to younger users.
- Design choices carry legal weight. Age screens, default settings, data minimization, and “kids modes” are not cosmetic. If they’re porous in practice, exposure grows quickly.
- Discovery pressure is real. Once a case proceeds, companies often must produce product docs, engineering notes, consent records, and vendor contracts. That fuels risk assessments, settlement talks, and potential product changes.
Trends privacy professionals are watching
- Age assurance that actually works: Clearer signals and layered checks to keep under-13 users in dedicated experiences with tighter data limits (see the sketch after this list).
- Data minimization for kids by default: Collect less, retain for shorter periods, and cut third-party sharing pathways that are not essential to core functionality.
- Verified parental involvement where required: Mechanisms that reliably tie a parent or guardian to a child’s account when the law calls for it.
- Marketing and recommendations for minors: Careful controls around targeted features that could be viewed as profiling or ad targeting of children.
- Vendor and SDK hygiene: Contracts and technical controls that prevent downstream partners from repurposing kids’ data.
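The first two items above are easier to picture in code. Below is a minimal TypeScript sketch of the age-gating and data-minimization-by-default pattern: an unknown or under-13 age signal routes the session to a restricted collection policy. The types, field names, and values here (AgeSignal, DataCollectionPolicy, the 30-day retention figure) are illustrative assumptions, not anything drawn from TikTok’s systems or the court record.

```typescript
// Minimal sketch: route sessions to a restricted data-collection policy
// when the user is, or may be, under 13. All names and values are
// hypothetical and for illustration only.

type AgeSignal = { selfReportedBirthYear?: number; parentVerified?: boolean };

interface DataCollectionPolicy {
  collectPreciseLocation: boolean;
  collectContacts: boolean;
  behavioralProfiling: boolean;
  retentionDays: number;
  thirdPartySharing: boolean;
}

// Conservative defaults for users treated as under 13: collect less,
// retain for a shorter period, and cut non-essential sharing pathways.
const UNDER_13_POLICY: DataCollectionPolicy = {
  collectPreciseLocation: false,
  collectContacts: false,
  behavioralProfiling: false,
  retentionDays: 30,
  thirdPartySharing: false,
};

const STANDARD_POLICY: DataCollectionPolicy = {
  collectPreciseLocation: true,
  collectContacts: true,
  behavioralProfiling: true,
  retentionDays: 365,
  thirdPartySharing: true,
};

function isLikelyUnder13(signal: AgeSignal, now = new Date()): boolean {
  if (signal.selfReportedBirthYear === undefined) {
    // Unknown age: fail closed (restricted) rather than open.
    return true;
  }
  return now.getFullYear() - signal.selfReportedBirthYear < 13;
}

function policyFor(signal: AgeSignal): DataCollectionPolicy {
  // Under-13 (or unknown-age) sessions stay on the restricted policy.
  // A verified-parent flow (signal.parentVerified) could relax specific
  // settings only where the applicable rules allow it.
  if (isLikelyUnder13(signal)) {
    return UNDER_13_POLICY;
  }
  return STANDARD_POLICY;
}

// Example: an account with no reliable age signal gets the restricted policy.
console.log(policyFor({}));                              // UNDER_13_POLICY
console.log(policyFor({ selfReportedBirthYear: 1995 })); // STANDARD_POLICY
```

The design choice worth noting is the default: when the age signal is missing or ambiguous, the sketch falls back to the restricted policy rather than the standard one, which is the "tighter data limits by default" posture the trends above describe.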
What to watch next
- Discovery outcomes: Evidence about how age-gating, consent, and data flows worked in practice will shape the path forward.
- Potential remedies: Settlements in this space often include product commitments, independent assessments, and restrictions on certain data uses for minors.
- Ripple effects: Other platforms may adjust default settings and developer integrations for youth accounts to reduce similar exposure.
Children’s data controls
The court’s signal is straightforward: children’s data controls must be built for real-world use, not just policy pages. As this case advances, it will keep attention on age assurance, parental consent, and data minimization for young users. Expect more scrutiny of how “kids modes” are constructed, how easy it is to bypass them, and whether data practices match the promises made to families.