A new Canadian academic study reviewed privacy policies from children’s games and concluded that studios routinely fall short of applicable children’s privacy laws. The research examined 139 titles aimed at minors and found pervasive collection of sensitive data, heavy reliance on opaque policy language, and weak or missing parental consent flows. For U.S. publishers, the message is simple: practices highlighted by the study map directly onto risk under the Children’s Online Privacy Protection Act (COPPA) and related state actions.
We work with video game makers to set up their privacy policies, and our software automates privacy notices so studios stay compliant and up to date.
Key patterns the study flags in children’s games
- Broad data capture: location data, device and advertising identifiers, behavioral analytics (including chat and play patterns), and microtransaction telemetry.
- Opaque disclosures: long, complex privacy policies that are hard for parents and teens to parse, with consent bundled into lengthy terms rather than clear choices.
- Third-party trackers: extensive use of SDKs and adtech that can enable profiling or targeted ads, often without granular, verifiable parental consent.
- Jurisdictional gaps: policies that cite U.S. and Canadian rules but do not resolve different age thresholds, notice standards, or language requirements.
Why this matters for U.S. studios
Under COPPA, companies must obtain verifiable parental consent before collecting personal information from children under 13. The Federal Trade Commission has shown it will act, and recent cases demonstrate that children’s privacy is a top-tier enforcement priority. In addition to federal scrutiny, state attorneys general and plaintiffs’ firms are increasingly active around youth protections and alleged “dark patterns” in games.
Recent fines and settlements in gaming and kids’ privacy
The most visible precedent remains the $520 million package against Epic Games, which combined a $275 million COPPA penalty with $245 million in consumer refunds over design practices the FTC said led to unwanted charges by children. The orders also forced changes to defaults and deletion of certain data collected from kids.
More recently, the FTC announced a $20 million settlement involving the publisher of Genshin Impact that included children’s privacy allegations tied to how the game presented and marketed chance-based items to minors. The order requires deletion of improperly collected data and clearer disclosures.
Outside of core gaming but relevant for kids’ content distribution, The Walt Disney Company agreed to a $10 million COPPA settlement over mislabeling that allowed children’s data to be collected for targeted ads. The case underscores that distribution channels and labeling decisions can create COPPA exposure even if a company is not a traditional “kids app” operator.
Where practices go wrong most often
- Age gates that don’t verify: Asking a user to self-report age is not enough. Studios need verifiable parental consent for under-13 users before any data collection that COPPA covers (see the sketch after this list).
- SDKs without controls: Third-party analytics and ad SDKs can transmit identifiers or behavioral data beyond a publisher’s direct oversight. Contracts and technical settings must restrict downstream profiling and require deletion on request.
- Bundled, hard-to-understand consent: Long policies and “take it or leave it” flows invite scrutiny. Parents should get concise, understandable notices and granular choices.
- Retention creep: Keeping chat logs, telemetry, or purchase history “for improvement” can collide with children’s deletion rights. Studios need documented retention schedules that are short and enforced.
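To make the age-gate and SDK points concrete, here is a minimal TypeScript sketch of a consent gate: analytics and ad SDKs stay off for under-13 accounts until a verifiable parental consent record exists. The `ConsentStore` interface and the telemetry helper names are hypothetical placeholders, not any specific vendor’s API; treat this as a sketch of the pattern, not a drop-in implementation.

```typescript
// Minimal consent gate: no covered collection before verifiable parental consent.
// ConsentStore and the telemetry helpers below are hypothetical placeholders.

type ConsentRecord = {
  childAccountId: string;
  method: "credit_card" | "id_match" | "signed_form"; // examples of FTC-recognized methods
  verifiedAt: Date;
  revoked: boolean;
};

interface ConsentStore {
  lookup(childAccountId: string): Promise<ConsentRecord | null>;
}

const COPPA_AGE_THRESHOLD = 13;

async function startSession(accountId: string, age: number, store: ConsentStore) {
  // Self-reported age only decides which track applies; it is not consent.
  if (age >= COPPA_AGE_THRESHOLD) {
    enableFullTelemetry(accountId);
    return;
  }

  const consent = await store.lookup(accountId);
  if (consent && !consent.revoked) {
    // Parent verified: enable only the narrow, disclosed collection.
    enableChildSafeTelemetry(accountId);
  } else {
    // No verified consent: gameplay runs, but all covered collection stays off.
    runWithoutCollection(accountId);
  }
}

function enableFullTelemetry(accountId: string) {
  /* init analytics + ad SDKs for 13+ users (hypothetical) */
}
function enableChildSafeTelemetry(accountId: string) {
  /* contextual-only mode: no ad IDs, no location, no behavioral profiling */
}
function runWithoutCollection(accountId: string) {
  /* gameplay only; queue a parental-consent request flow */
}
```

The design choice worth copying is the default: the no-consent path is the safe path, and collection is something you opt into only after verification succeeds.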
Action plan for studios and publishers to stay compliant
- Run a children’s data audit: Identify titles directed at children or that have a substantial child audience. Map data elements, SDKs, vendors, and where data flows, including cross-border storage.
- Rebuild consent: Implement verifiable parental consent with approved methods before any covered collection. Provide parents with plain-language notices, toggles, and a dashboard to revoke consent.
- Minimize and segregate: Remove location, advertising IDs, and behavioral profiling from kids’ experiences. If analytics are necessary, use aggregated or contextual approaches, and segregate kids’ data from general audiences.
- Control your SDKs: Require children’s-mode settings for analytics/monetization SDKs, prohibit data reuse, document sub-processors, and ensure deletion SLAs. Audit periodically.
- Tighten retention: Default to the shortest retention periods and implement deletion signals flowing to every vendor in the chain (a sketch follows this list).
- Prepare for investigations: Maintain exportable consent logs, data-flow diagrams, DPIAs, and vendor attestations. These artifacts are critical if regulators ask questions.
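As a rough illustration of the retention and deletion items above, the sketch below pairs a per-category retention schedule with a deletion signal that fans out to every downstream vendor. The category names, retention periods, and `VendorClient` interface are assumptions for illustration, not a real integration.

```typescript
// Hypothetical retention schedule and deletion fan-out; values are illustrative.

const RETENTION_DAYS: Record<string, number> = {
  chat_logs: 30,
  gameplay_telemetry: 90,
  purchase_history: 365, // kept longer only where tax/refund obligations require it
};

interface VendorClient {
  name: string;
  requestDeletion(accountId: string): Promise<void>; // backed contractually by a deletion SLA
}

function isExpired(category: string, collectedAt: Date, now = new Date()): boolean {
  const days = RETENTION_DAYS[category];
  if (days === undefined) return true; // unknown category: delete by default
  const ageMs = now.getTime() - collectedAt.getTime();
  return ageMs > days * 24 * 60 * 60 * 1000;
}

// On a parent's deletion request, propagate to every vendor in the chain and
// record each outcome so the consent/deletion log is exportable for regulators.
async function propagateDeletion(accountId: string, vendors: VendorClient[]) {
  const results = await Promise.allSettled(
    vendors.map((v) => v.requestDeletion(accountId)),
  );
  results.forEach((r, i) => {
    if (r.status === "rejected") {
      console.error(`deletion failed at ${vendors[i].name}; retry and escalate`);
    }
  });
}
```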
How KOSA could change the landscape
The Kids Online Safety Act would impose a federal duty of care on covered platforms to prevent and mitigate harms to minors, require risk assessments and independent audits, expand parental tools, and address design features that promote compulsive use. While COPPA focuses on data collection and consent for children under 13, KOSA targets safety and design risks that affect older minors as well, potentially reaching game features like autoplay, social sharing, and in-game purchases. Multiple summaries and the current bill text underscore audits, transparency, and stronger parental controls. Studios that build youth experiences should assume KOSA-type obligations are coming and align now.
What to tell your board and insurers
- Risk is measurable: Map the children’s portfolio and quantify SDK and adtech exposure. Set quarterly KPIs for consent rates, deletion fulfillment, SDK compliance, and audit findings (see the sketch after this list).
- Coverage clarity: Review cyber/privacy policies for children’s data coverage and exclusions tied to “unlawful collection” or “intentional violations.” Be prepared for underwriting questions about SDK controls and verifiable consent.
- Product tradeoffs: If monetization depends on ads or behavioral analytics, consider separate kids’ experiences with contextual ads only, or a paid, ad-light model for families.
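For the KPI item above, here is a short sketch of the kind of quarterly roll-up a board might review, computed from hypothetical consent and deletion logs; the record shapes and SLA window are assumptions, not a prescribed schema.

```typescript
// Quarterly KPI roll-up from hypothetical consent and deletion logs.

type ConsentEvent = { requested: boolean; verified: boolean };
type DeletionTicket = { openedAt: Date; closedAt?: Date };

// Share of consent requests that ended in verified parental consent.
function consentRate(events: ConsentEvent[]): number {
  const requested = events.filter((e) => e.requested).length;
  const verified = events.filter((e) => e.verified).length;
  return requested === 0 ? 1 : verified / requested;
}

// Share of deletion requests fulfilled within the SLA window (e.g., 30 days).
function deletionFulfillment(tickets: DeletionTicket[], slaDays = 30): number {
  if (tickets.length === 0) return 1;
  const withinSla = tickets.filter((t) => {
    if (!t.closedAt) return false;
    const days = (t.closedAt.getTime() - t.openedAt.getTime()) / 86_400_000;
    return days <= slaDays;
  }).length;
  return withinSla / tickets.length;
}
```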
Bottom line: use Captain Compliance’s software to keep privacy notices current
The study’s conclusion is blunt: kids’ games too often collect more than they should, explain less than they ought to, and rely on vendors that complicate compliance. U.S. enforcement history shows the costs can be large and public. Treat children’s privacy as a product requirement, not a legal afterthought. Build for consent, minimize by design, lock down SDKs, and document everything. That is how studios reduce legal exposure and earn durable trust with families. If you use Captain Compliance’s privacy notice software to stay current, you can rest easier knowing you are protecting and respecting your data subjects.