In a move that could reshape the national conversation around youth and technology, California Governor Gavin Newsom has signaled support for stricter social media restrictions for teenagers under the age of 16. The proposal, still taking shape in legislative channels, reflects mounting bipartisan concern over the psychological, developmental, and safety impacts of algorithm-driven platforms on minors.
California — home to Silicon Valley and many of the world’s largest social media companies — now finds itself at the center of a regulatory push that could redefine the digital age of consent in the United States.
The legislation Newsom is backing would require parental consent for younger teens and place tighter limits on the algorithmic features widely criticized for encouraging compulsive use. He has framed the effort as a public health matter rather than a political one, citing growing evidence linking heavy social media use to anxiety, depression, and sleep disruption among adolescents.
The governor’s endorsement also lends fresh momentum to federal efforts, including the proposed Kids Online Safety Act (KOSA). Critics warn of enforcement challenges and potential First Amendment problems, but supporters counter that tech companies have long operated without meaningful guardrails where children are concerned. If enacted, California’s under-16 framework could become a model for other states and further intensify pressure on platforms to build youth safety into their products rather than bolt it on afterward.
A Push Toward Parental Consent and Age Verification
The emerging framework under consideration would require parental consent for users under 16 and could impose new constraints on addictive design features, algorithmic recommendation engines, and push notifications targeted at younger audiences. Supporters argue the proposal is less about prohibition and more about recalibrating the power dynamic between platforms and families.
Newsom’s backing is notable because California has historically walked a careful line between protecting innovation and imposing guardrails. However, a growing body of research tying excessive social media use to anxiety, depression, sleep disruption, and body image issues among teens has shifted the political momentum.
Lawmakers in Sacramento are studying mechanisms ranging from default time limits to stricter age-verification protocols, though enforcement remains one of the thorniest challenges. Privacy advocates warn that aggressive identity verification systems could introduce new data-collection risks, particularly if platforms require government ID uploads or biometric scans.
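To make that consent requirement concrete (purely as a sketch, since no bill text prescribes an implementation), the Python fragment below gates algorithm-driven features for under-16 accounts behind a recorded parental consent flag; every name in it is a hypothetical placeholder rather than anything drawn from the actual proposal.

```python
from dataclasses import dataclass

CONSENT_AGE = 16  # the threshold discussed in the California proposal

@dataclass
class Account:
    """Hypothetical account record; both fields are illustrative placeholders."""
    age: int
    parental_consent: bool = False

def algorithmic_feed_allowed(account: Account) -> bool:
    """Enable the recommendation feed only for users 16+ or with recorded parental consent."""
    return account.age >= CONSENT_AGE or account.parental_consent

def push_notifications_allowed(account: Account) -> bool:
    """Apply the same gate to push notifications aimed at the user."""
    return account.age >= CONSENT_AGE or account.parental_consent

# A 14-year-old without recorded consent gets neither feature by default.
teen = Account(age=14)
assert not algorithmic_feed_allowed(teen)
assert not push_notifications_allowed(teen)
```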
The Federal Context: KOSA and the Duty of Care Debate
At the federal level, similar concerns have crystallized around the proposed Kids Online Safety Act (KOSA), which would impose a statutory “duty of care” on online platforms to mitigate harms to minors. The bill has generated both strong support and pointed criticism.
Proponents of KOSA argue that platforms have long profited from engagement-maximizing algorithms without adequate safeguards for children. The legislation would require companies to:
- Conduct risk assessments related to harms to minors
- Provide parents with supervision tools
- Limit algorithmic amplification of harmful content
- Offer minors stronger privacy protections
Critics, however, contend that the bill’s language around “harmful content” could open the door to overbroad content moderation or unintended censorship, particularly affecting marginalized communities. Free speech organizations and digital rights groups remain deeply divided on whether KOSA strikes the appropriate balance.
California’s potential under-16 restrictions could function as a state-level complement to KOSA — or, depending on how they are drafted, a more aggressive alternative.
Europe’s Stricter Framework: GDPR and Age of Digital Consent
Globally, the European Union has already established firmer rules governing minors’ data through the General Data Protection Regulation (GDPR). Under the GDPR, the default age of digital consent is 16, though member states may lower it to as low as 13. Several member states have kept a threshold of 15 or 16, effectively requiring parental consent before younger teens can use certain data-processing services.
Beyond consent rules, Europe is layering additional protections under the Digital Services Act (DSA), which restricts targeted advertising based on profiling of minors and increases transparency obligations for algorithmic systems.
The European regulatory philosophy emphasizes structural accountability: limiting behavioral advertising, enhancing transparency, and requiring documented risk assessments. California’s debate appears to be drawing inspiration from this compliance-first model.
The UK’s “Age-Appropriate Design” Model
In the United Kingdom, the Information Commissioner’s Office (ICO) has enforced the Age-Appropriate Design Code, sometimes called the Children’s Code. It mandates high privacy settings by default for minors and restricts the use of “nudge” techniques that encourage children to weaken their privacy protections.
Rather than outright banning access, the UK model focuses on design constraints — limiting autoplay features, disabling geolocation by default, and preventing profiling unless demonstrably necessary.
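As a rough illustration of that design-first posture, and not a rendering of the Children’s Code’s actual requirements, the sketch below shows how "high privacy by default" settings might be expressed in code; all of the setting names are assumptions.

```python
# Hypothetical defaults applied when an account is believed to belong to a child.
CHILD_DEFAULTS = {
    "autoplay": False,              # no auto-playing media
    "geolocation": False,           # location sharing off unless deliberately re-enabled
    "behavioral_profiling": False,  # no profiling-based recommendations or ads
    "profile_visibility": "private",
}

ADULT_DEFAULTS = {
    "autoplay": True,
    "geolocation": True,
    "behavioral_profiling": True,
    "profile_visibility": "public",
}

def defaults_for(is_child: bool) -> dict:
    """Start children on the most protective configuration; any loosening
    would have to be an explicit, separately handled choice."""
    return dict(CHILD_DEFAULTS if is_child else ADULT_DEFAULTS)

print(defaults_for(is_child=True))
```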
This “safety by design” philosophy is gaining traction in U.S. policy circles. California has already experimented with similar frameworks through prior legislation, and Newsom’s support for new under-16 restrictions suggests a willingness to extend that approach.
Australia’s Aggressive Enforcement Posture
Australia has taken a particularly assertive stance through its eSafety Commissioner, an independent regulator empowered to compel removal of harmful online content and enforce child-protection standards. Lawmakers there have floated proposals to raise the minimum age for social media accounts and impose significant penalties for non-compliance.
The global pattern is unmistakable: advanced economies are moving away from self-regulation toward codified, enforceable obligations.
Industry Response and Economic Implications
For social media platforms, California’s move carries both operational and financial implications. Teens represent a critical demographic for long-term user acquisition. Restricting access under 16 could materially affect growth metrics, advertising revenue, and engagement benchmarks.
Technology companies are expected to argue that parental empowerment tools already exist and that education — not restriction — is the appropriate solution. Industry lobbyists will likely press concerns about constitutional challenges, particularly under First Amendment doctrine.
Legal scholars anticipate that any California law imposing outright age-based bans could trigger immediate court scrutiny. Previous state attempts at regulating online platforms have faced constitutional headwinds.
Enforcement and Privacy Trade-Offs
The most technically complex issue remains age verification. Effective enforcement requires reliable age determination, yet stronger verification often entails collecting more personal data — a paradox in legislation intended to enhance privacy.
Biometric verification, AI-driven facial age estimation, and third-party digital ID providers are among the technologies under consideration. Each introduces separate compliance and civil liberties questions.
Data minimization principles under existing privacy frameworks could clash with aggressive age-verification mandates, forcing regulators to reconcile child protection goals with consumer privacy safeguards.
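One way the tension could be eased, sketched below on the assumption that some external verifier supplies a one-time age estimate, is to retain only a yes/no attestation rather than the estimate, the document, or any biometric data behind it; every identifier here is hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AgeAttestation:
    """The only age-related record retained in this data-minimizing sketch."""
    over_threshold: bool  # e.g. "16 or older", not the user's actual age
    threshold: int        # which age was tested
    checked_at: datetime  # when the check happened

def make_attestation(estimated_age: int, threshold: int = 16) -> AgeAttestation:
    """Turn a one-time age estimate (from an ID check, facial age estimation,
    or a parent's confirmation) into a minimal attestation. The estimate and
    any underlying evidence are discarded; only the comparison result survives."""
    return AgeAttestation(
        over_threshold=estimated_age >= threshold,
        threshold=threshold,
        checked_at=datetime.now(timezone.utc),
    )

# The platform learns only that this user cleared the 16+ bar, and when.
record = make_attestation(estimated_age=17)
assert record.over_threshold and record.threshold == 16
```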
A Turning Point in Youth Digital Policy
Newsom’s endorsement signals that youth social media regulation is no longer a fringe policy discussion. It is rapidly becoming a core governance issue intersecting public health, constitutional law, data protection, and platform economics.
Whether California ultimately enacts a strict under-16 limitation or a more nuanced parental consent regime, the state’s position carries outsized influence. As the regulatory home of Silicon Valley, its policies often ripple nationally and globally.
With federal legislation like KOSA advancing and international frameworks tightening, the era of largely unregulated youth access to algorithmic platforms appears to be closing.
The question is no longer whether governments will intervene — but how far they will go, and how quickly technology companies can adapt to a world where protecting minors is not merely a public relations promise, but a legally enforceable obligation.