A closer look at what the agency is prioritizing as new rules kick in and enforcement heats up
This piece draws from remarks by FTC Associate Director Ben Wiseman at a recent IAPP event, along with details on ongoing cases and upcoming rules.
The Federal Trade Commission isn’t easing up on children’s online privacy. If anything, things are picking up speed under the new leadership. A top official just laid out where the agency is putting its energy, and it’s clear: updated rules, deepfake laws, age checks, and a string of recent cases are all signaling tougher scrutiny ahead.
Ben Wiseman, associate director in the FTC’s Division of Privacy and Identity Protection, spoke at an IAPP meeting on January 21. He made it plain—the division’s day-to-day work is all about privacy and data security, and kids’ issues are front and center right now.
The Updated COPPA Rule Takes Center Stage
A lot of the focus is on the big overhaul of the Children's Online Privacy Protection Act Rule, COPPA for short. The FTC finalized the amendments in early 2025, the first major update since 2013, and companies have until April 22, 2026, to get compliant.
The changes aren’t small. One of the biggest is requiring separate, verifiable parental consent before sites or apps can share kids’ personal info with third parties for targeted advertising. Parents now get a clear chance to say no.
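To make that concrete, here is a minimal sketch in Python of how a service might gate third-party ad sharing on a separate parental consent flag. The field and function names (ChildRecord, targeted_ads_consent, share_with_ad_partner) are hypothetical, not anything the rule prescribes; the point is simply that consent to disclose data for targeted advertising is recorded and checked on its own, with no sharing by default.

```python
from dataclasses import dataclass

@dataclass
class ChildRecord:
    user_id: str
    # Baseline verifiable parental consent to collect the child's data.
    collection_consent: bool = False
    # Separate, opt-in consent for disclosing data to third parties
    # for targeted advertising.
    targeted_ads_consent: bool = False

def send_to_partner(payload: dict) -> None:
    """Placeholder for whatever ad-partner integration a service uses."""
    print(f"forwarding {len(payload)} fields to ad partner")

def share_with_ad_partner(record: ChildRecord, payload: dict) -> bool:
    """Forward a child's data to an ad partner only if a parent has
    separately opted in; the default is no sharing."""
    if not (record.collection_consent and record.targeted_ads_consent):
        return False
    send_to_partner(payload)
    return True
```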
Another key piece: limits on how long companies can hold onto children’s data. No more keeping it indefinitely “just in case”—retention has to be tied to a real purpose, and companies need policies to delete it when that purpose ends.
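As a rough illustration of what purpose-bound retention can look like in code, the sketch below (Python again, with invented purposes and windows) tags each stored record with the purpose it was collected for, looks up a retention window for that purpose, and drops anything whose window has lapsed. Actual schedules would come from a company's written retention policy, not hard-coded constants.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical retention windows, one per documented collection purpose.
RETENTION_BY_PURPOSE = {
    "account_service": timedelta(days=365),
    "support_ticket": timedelta(days=90),
}

@dataclass
class StoredRecord:
    record_id: str
    purpose: str
    collected_at: datetime  # expected to be timezone-aware (UTC)

def expired(record: StoredRecord, now: datetime) -> bool:
    """A record expires once its purpose-specific window has passed,
    or immediately if it has no documented purpose at all."""
    window = RETENTION_BY_PURPOSE.get(record.purpose)
    if window is None:
        return True
    return now - record.collected_at > window

def sweep(records: list[StoredRecord]) -> list[StoredRecord]:
    """Keep only the records still within their retention window."""
    now = datetime.now(timezone.utc)
    return [r for r in records if not expired(r, now)]
```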
Wiseman called this a priority area and hinted at new guidance coming soon. “Stay tuned,” he said. For anyone running a site or app that collects info from kids under 13, that’s code for “get your house in order.”
Why the push? Kids are online earlier and more than ever—apps, games, social platforms, educational tools. Data collection has exploded, and parents want more control. The old rules just weren’t keeping up with how companies monetize that info.
The TAKE IT DOWN Act and Deepfake Threats
Another new tool in the FTC’s kit is the TAKE IT DOWN Act, signed into law back in May 2025. It goes after nonconsensual intimate deepfakes—those AI-generated fake images or videos that can devastate lives, especially for young people.
The law makes publishing that kind of content a crime, but it also puts civil obligations on platforms: they have to remove it within 48 hours of a valid request from a victim, and set up clear processes for handling those requests.
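The shape of a compliant intake process will vary by platform, but it roughly amounts to a queue of victim reports, each with a removal deadline. The sketch below is a hypothetical Python illustration (invented class and method names) keyed to that 48-hour window; it only flags overdue reports and leaves the actual review and removal steps to the platform.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Removal window for valid victim reports under the TAKE IT DOWN Act.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class TakedownReport:
    report_id: str
    content_url: str
    received_at: datetime  # expected to be timezone-aware (UTC)
    resolved: bool = False

    @property
    def deadline(self) -> datetime:
        return self.received_at + REMOVAL_WINDOW

@dataclass
class TakedownQueue:
    reports: list[TakedownReport] = field(default_factory=list)

    def file(self, report: TakedownReport) -> None:
        self.reports.append(report)

    def overdue(self) -> list[TakedownReport]:
        """Unresolved reports past their removal deadline."""
        now = datetime.now(timezone.utc)
        return [r for r in self.reports if not r.resolved and now > r.deadline]
```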
Wiseman said the FTC will handle the civil side—the Department of Justice gets the criminal cases. But the agency is ready to enforce from day one, even though full mandatory removal doesn’t start until May 2026. Some platforms have already begun voluntary takedowns.
This isn’t abstract. Reports of deepfake harassment in schools spiked in recent years, with teens targeted in cruel ways. The law gives victims a faster path to getting harmful content removed, and the FTC wants to make sure companies follow through.
Age Verification: The Next Big Debate
The FTC is also digging into age verification tech. They’re holding a workshop on January 28, bringing together experts, advocates, companies, and regulators to talk through what’s working and what isn’t.
Questions on the table: How accurate are these tools? What privacy risks do they create—uploading IDs or biometric scans just to prove you’re old enough? How do they fit with COPPA and other laws?
Wiseman noted the tech is “really emerging.” Some states have already passed kids’ safety laws requiring age checks for certain sites, and more are coming. The workshop will help shape how the FTC thinks about enforcement and guidance.
It’s tricky territory. On one hand, better age gating could keep kids off adult content or addictive platforms. On the other, clunky or invasive systems can drive users away or end up collecting even more sensitive data in the process.
Recent Cases Show Where the Line Is Drawn
Wiseman pointed to a few ongoing and settled cases as examples of what catches the FTC’s eye.
One big one is the joint lawsuit with Utah against operators of adult websites. The complaint alleges the sites didn’t properly verify ages or block kids from accessing explicit material. That case is still working its way through court.
Another: a $20 million settlement with Cognosphere, the company behind the popular game Genshin Impact. The FTC said the game let kids under 16 make in-game purchases without real parental consent, racking up big bills. The deal requires blocking those transactions unless parents explicitly approve.
These aren’t isolated. The FTC has hit companies with record COPPA fines in recent years—hundreds of millions in some cases—for lax consent, excessive data collection, or poor security.
Wiseman’s advice was straightforward: watch the complaints and settlements closely. They show what commissioners see as problems in the market right now.
What This Means for Companies
If your business touches kids’ data—games, edtech, social apps, toys with mics or cameras, even general sites that attract underage users—you’re in scope.
Practical steps coming out of this: review consent flows, tighten data retention policies, build better parental controls, and think hard about third-party sharing. For platforms hosting user content, set up solid reporting and removal systems for deepfakes.
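On the consent-flow point, a common starting place is a neutral age screen at signup that routes users under 13 into a parental consent flow instead of straight into the product. The Python sketch below shows one hypothetical way to express that branching; the flow names are illustrative, and nothing here substitutes for the rule's verifiable-consent methods.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

def age_in_years(birthdate: date, today: date) -> int:
    """Whole years elapsed between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def signup_route(birthdate: date) -> str:
    """Route under-13 signups into a parental consent flow; everyone else
    goes to the standard flow. Flow names here are illustrative only."""
    if age_in_years(birthdate, date.today()) < COPPA_AGE_THRESHOLD:
        return "parental_consent_flow"
    return "standard_signup_flow"
```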
Smaller companies might feel squeezed. Compliance isn’t cheap—audits, new tech, legal reviews. But the alternative is worse: fines, bad press, or lawsuits.
Wiseman stressed proactive measures. Get ahead of the guidance, not behind it.
The Bigger Picture for Kids’ Safety Online
This all fits into a broader wave. States like California and Utah have passed their own kids’ privacy and safety laws. Congress has debated federal updates. Platforms face pressure from parents, schools, and advocates to do more.
Deepfakes, addictive algorithms, targeted ads aimed at kids—these aren’t edge cases anymore. They’re everyday risks. And with AI making fake content easier to produce, the problems are growing faster than solutions in some areas.
The FTC under Chair Andrew Ferguson seems committed to using its tools aggressively. Enforcement isn’t just about punishment; it’s about setting expectations for the whole industry.
Parents get more say, kids get more protection, and companies get clearer lines—though crossing them will cost more than it used to.
Kids’ Safety – Hire Captain Compliance to Get Compliant
Short term, watch the age verification workshop and any guidance on the new COPPA provisions. Longer term, watch how aggressively the FTC pursues violations once the April 2026 compliance deadline passes.
If recent actions are any guide, expect more settlements, more complaints, and probably a few high-profile cases meant to set an example.
For anyone in the space, the message is clear—kids’ privacy isn’t optional, and the FTC is paying close attention.