Key Allegations by Florida AG Uthmeier
The complaint filed by Florida alleges the following key misconduct by Roku:
- Roku is accused of collecting and selling—or enabling the re-identification of—sensitive personal data gathered from children, including viewing habits, voice recordings, device interactions, and location-based information, all without meaningful notice to parents or express parental consent.
- Despite knowing that many users of its platform are children, Roku allegedly failed to implement industry-standard user profiles or effective age-verification controls to distinguish child users from adults. The complaint states Roku “buries its head in the sand so that it can continue processing and selling children’s valuable personal and sensitive data.”
- Roku allegedly misrepresented the effectiveness of its privacy controls and opt-out tools for children’s data processing, giving families a false sense of protection.
- It is further claimed that Roku entered into partnerships with third-party data brokers to monetize children’s personal data, circumventing protections that might otherwise apply under Florida law. The complaint names the broker Kochava as one such partner.
Regulatory Framework in Play
The Florida Digital Bill of Rights, passed in 2023, grants consumers — particularly parents and guardians — enhanced control over how personal and sensitive data is collected, processed, sold or shared, especially when children are involved. Among other requirements, it mandates clear consent, meaningful disclosures, and special safeguards when children’s data is processed for sale or profiling.
In parallel, FDUTPA empowers the Attorney General to pursue unfair or deceptive practices in trade or commerce. By combining these statutes, the state is targeting both the substantive data-privacy obligations (via FDBOR) and the broader consumer-protection regime (via FDUTPA).
Why This Case Matters
This enforcement action carries significance on multiple fronts:
First, it signals strong regulatory focus on children’s data and the television/streaming ecosystem—areas where device makers, platforms, apps and ad-tech converge. Schools, households, and families now expect tech companies to treat children’s data differently.
Second, it demonstrates state-level privacy enforcement gaining momentum. While federal laws such as the Children’s Online Privacy Protection Act (COPPA) remain important, states are now layering in their own regimes and acting aggressively on consumer-protection statutes.
Third, the case highlights accountability for complex ecosystems. Roku’s platform spans hardware, apps, streaming, partner channels and ad-tech, meaning the responsibility for age verification, data-sharing, consent flows, and monetization may overlap multiple stakeholders. Regulators expect clear roles, documented controls and transparent vendor/connectivity chains.
Finally, the action sets a precedent: merely having privacy controls is not enough—companies must demonstrate that those controls work, are monitored, are risk-assessed, and are applied continuously. The complaint emphasizes that Roku “should have been aroused to question the age of its users,” underlining the expectation of proactive governance.
Detailed Timeline & Scope
While the complaint document itself (filed in Florida’s 20th Judicial Circuit) is not yet publicly available, the press release and news coverage provide key timeline elements and operational scope:
- Roku reports it serves over 145 million U.S. users as of 2024, making it the largest streaming-platform distribution service in the U.S.
- The enforcement notice appears to target processing of children’s data over multiple years, alleging ongoing sale or sharing of behavioral and device data from minors without consent or effective age-segmentation.
- The complaint includes charges that Roku’s business architecture allowed third-party data brokers to obtain children’s information. The broker Kochava is explicitly referenced.
- Roku’s ability to track devices across viewing sessions, applications, device types, and ad-tech integrations is cited as enabling re-identification of children’s profiles despite anonymization filters. Such re-identification of “sensitive personal data” is central to the state’s case.
Implications for Businesses and Compliance Teams
For organizations operating connected devices, streaming platforms, internet-of-things (IoT) ecosystems, ad-tech stacks, or child-facing services, this case offers several actionable implications:
- If your product is accessible to minors, or you process data on users who may be under 18 (or under state-specified age thresholds), implement reliable age verification, segmentation of child and adult profiles, and tailored consent flows for minors.
- Audit your data-broker, advertising, and partner-ecosystem relationships to ensure children’s profiles are not shared, sold, or repurposed for behavioral advertising or profiling without verifiable parental consent and effective opt-out mechanisms.
- Review your privacy controls and claims critically. Regulators are asking: Do these controls work in practice? Are they subject to testing, audit, and evidence? Are they effective for children? Misleading parents about “opt-out” mechanisms or claims that “we do not sell children’s data” may expose you to liability.
- Design your governance around layered responsibilities: hardware/OS vendor, platform provider, channel/app store, ad-tech partner, device manufacturer. Document where age-assurance, consent-capture, data-retention, and sharing responsibilities lie—and evidence your internal monitoring.
- Ensure your incident-response and regulator-engagement playbooks include children’s-data scenarios. Since states increasingly wield consumer-protection laws (like FDUTPA) alongside data-privacy statutes, your remediation plans should extend beyond breaches to policy, transparency, and contractual failures.
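To make the first two implications concrete, here is a minimal, deny-by-default sketch of a data-sharing gate. All names (`UserProfile`, `may_share_with_ad_partners`, the age threshold) are hypothetical illustrations, not a prescribed implementation—real age-assurance and consent systems are considerably more involved:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical threshold; statutes variously use 13, 16, or 18.
CHILD_AGE_THRESHOLD = 18

@dataclass
class UserProfile:
    user_id: str
    verified_age: Optional[int]  # None means age was never verified
    parental_consent: bool       # verifiable parental consent on file

def may_share_with_ad_partners(profile: UserProfile) -> bool:
    """Deny by default: an unverified user is treated as a child."""
    if profile.verified_age is None:
        return False                     # no verification, no sharing
    if profile.verified_age < CHILD_AGE_THRESHOLD:
        return profile.parental_consent  # minors require parental consent
    return True                          # verified adults may be shared

# Usage: only the verified adult and the consented minor pass the gate.
assert may_share_with_ad_partners(UserProfile("u1", 34, False)) is True
assert may_share_with_ad_partners(UserProfile("u2", 10, True)) is True
assert may_share_with_ad_partners(UserProfile("u2", 10, False)) is False
assert may_share_with_ad_partners(UserProfile("u3", None, False)) is False
```

The design choice the complaint implicitly demands is the default: absence of age verification blocks sharing, rather than permitting it—the opposite of “burying its head in the sand.”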
How Captain Compliance Can Support You
At Captain Compliance, we specialize in helping companies build privacy programs that meet today’s heightened data-privacy and compliance expectations, especially around children’s data, multi-jurisdictional obligations, and complex device/ad-tech ecosystems. Out-of-date privacy notices? We fix those, along with cookie consent banner disclosures.
Our offering includes:
- A dedicated “Children’s Data Compliance” module—mapping whether minors can access your service, reviewing age-profiles, consent flows, third-party data sharing and tracking for minors.
- Cross-jurisdictional privacy risk assessments—covering state-level laws (including FDBOR), federal laws (COPPA), and international regimes (GDPR, UK GDPR) so you have a holistic view of your obligations.
- Vendor/ad-tech risk reviews—identifying and remediating children’s-data exposures in your supply chain, reviewing broker relationships, profiling flows, and data-monetization paths.
- Governance & documentation readiness—building audit-ready records of processing activities, DPIAs, parental-control dashboards, consents, opt-out logs and vendor risk registers tailored for children’s data risks.
- Incident & regulatory-response preparation—helping you simulate children’s-data breach or misuse scenarios, develop evidence logs, notification templates and business continuity plans specific to this risk domain.
In a world where children’s data is high risk and regulators are ready to act, Captain Compliance offers you the structure, documentation and operational readiness to stay ahead of enforcement and protect your brand from expensive regulatory fines.
Get Protected Now If You Operate In Florida
Florida’s enforcement action against Roku underscores an important shift: regulators are moving beyond generic device privacy concerns and focusing squarely on how children’s data is processed, shared and monetized in complex digital ecosystems.
Simply having a “kids mode” or checkbox is no longer sufficient—expectations now center on demonstrable age verification, parental involvement, transparent disclosures, vendor controls, and continuous auditing.
If your organization touches children’s data—or sits at the intersection of streaming, devices, ad tech, or platform services—you must revisit your risk model, controls, and governance. Engaging a specialist partner like Captain Compliance can help you build the defensible program regulators expect—and avoid being the next headline.