European policymakers and digital economists are intensifying debates over how to regulate social media platforms, not just for market fairness but also to protect young users from harm. Against a backdrop of rising global concern about children’s online safety, a recent working paper from Bruegel (WP 02 2026) explores how digital platforms function as both economic actors and social ecosystems—highlighting the need for new policy frameworks that balance innovation with responsibility.
A Digital Economy at a Crossroads
At the heart of the discussion is a fundamental tension: digital platforms generate tremendous economic value while also posing unique risks for vulnerable users, especially minors. Platforms such as Meta’s Facebook and Instagram, TikTok, and YouTube have transformed social interaction, yet their algorithmic systems can expose children to harmful content, addictive design, and privacy erosion. This duality underscores why digital markets are not traditional markets: they combine social influence, data-driven advertising, and behavioral economics in ways that defy classic regulation. Public policy must therefore evolve to address these hybrid dynamics.

The Child Protection Imperative in EU Digital Regulation
Europe’s regulatory strategy has increasingly foregrounded children’s rights in the digital environment. The Digital Services Act (DSA) and General Data Protection Regulation (GDPR) already require platforms to limit targeted advertising to minors and ensure high levels of data privacy, but recent proposals signal a deeper shift. In early 2026, the European Commission outlined a plan that would explore a digital age of majority, effectively recommending an age-16 threshold for accessing certain services unless parental consent is given.
These proposals reflect growing concern over the “attention-capture economy,” in which algorithmic feeds maximize engagement, and potentially exposure to harmful content, especially among younger users. The Bruegel paper connects to these debates, arguing that purely market-driven platforms may not adequately internalize social risk without clear legal duties and accountability mechanisms.
Why Platform Regulation Matters More Than User Prohibitions
One key policy question illuminated by recent European analyses is whether age bans or stricter access rules are the best way to protect young users. Some EU member states, influenced by Australia’s Social Media Minimum Age Act and its under-16 prohibition, are considering similar bans or age-floor rules.
However, critics—including the Council of Europe’s human rights commissioner—warn that focusing solely on access restrictions puts the burden on children to avoid harm rather than holding platforms accountable for harmful design choices. They argue that platforms should be required to implement safety-by-design principles, risk assessments, content moderation transparency, independent audits, and enforceable sanctions for failures to protect children’s rights.
This shift—from restricting user access to regulating platform architecture—is central to emerging policy thinking in Brussels and beyond. Policymakers increasingly see platform design, advertising revenue structures, and algorithmic recommendation systems as core policy levers, rather than only user permissions or parental controls.
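To make the design-as-policy-lever idea concrete, here is a minimal sketch of how a safety-by-design constraint might sit inside a recommendation pipeline. It is illustrative only, not any platform’s actual system: the item fields, the risk scores (assumed to come from some upstream harm classifier), and the thresholds `RISK_CAP_MINOR` and `ENGAGEMENT_WEIGHT_MINOR` are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical content item: every field and threshold here is illustrative,
# not drawn from any real platform's system.
@dataclass
class Item:
    item_id: str
    engagement_score: float  # predicted engagement, 0..1
    risk_score: float        # predicted harm risk, 0..1, from an upstream classifier

RISK_CAP_MINOR = 0.2           # items above this risk are excluded for minors
ENGAGEMENT_WEIGHT_MINOR = 0.5  # engagement counts for less in minors' feeds

def rank_feed(items: list[Item], is_minor: bool) -> list[Item]:
    """Rank a candidate feed, applying a safety-by-design rule for minors."""
    if not is_minor:
        # Baseline feed: pure engagement ranking (the "attention-capture" default).
        return sorted(items, key=lambda i: i.engagement_score, reverse=True)
    # For minors: hard-exclude items over the risk cap, then rank with a
    # down-weighted engagement term and an explicit risk penalty.
    eligible = [i for i in items if i.risk_score <= RISK_CAP_MINOR]
    return sorted(
        eligible,
        key=lambda i: ENGAGEMENT_WEIGHT_MINOR * i.engagement_score - i.risk_score,
        reverse=True,
    )

candidates = [
    Item("a", engagement_score=0.9, risk_score=0.6),   # engaging but risky
    Item("b", engagement_score=0.7, risk_score=0.1),
    Item("c", engagement_score=0.4, risk_score=0.05),
]
print([i.item_id for i in rank_feed(candidates, is_minor=False)])  # ['a', 'b', 'c']
print([i.item_id for i in rank_feed(candidates, is_minor=True)])   # ['b', 'c']
```

The point of the sketch is the regulatory hook: a rule expressed at this layer governs how feeds are ranked for minors, rather than whether a given child can log in at all.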
The Three Pillars of a Child-Centered Digital Strategy
The Commission’s broader child safety agenda outlined in early 2026 emphasizes a multi-dimensional approach that goes beyond narrow legislative fixes. It rests on three pillars:
- Protection: Strengthening reporting systems and enforcement of existing laws, especially for harmful content such as cyberbullying and exploitation.
- Prevention: Supporting digital literacy, teacher training, and awareness programs that help children, educators, and parents understand digital risks.
- Participation: Involving children directly in policy design and evaluation, recognizing their role as both rights holders and active digital citizens.
This holistic framework aims to unify legal regulation with education and empowerment, acknowledging that children use digital spaces for social, creative, and civic interaction that legislation alone cannot simply suppress.
Age Verification and Its Practical Challenges
One flashpoint in the current regulatory landscape is age verification. While age gates and minimum-age requirements are politically popular, they raise significant implementation challenges: research shows that children can easily misrepresent their age or use VPNs to circumvent restrictions.
From a data protection perspective, requiring more personal data for verification—such as identity documents or biometric scans—can paradoxically increase privacy risks. Policymakers are therefore exploring alternatives, including privacy-preserving verification technologies and tiered social participation models that match access levels to developmental stages.
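As an illustration of the attestation idea behind privacy-preserving verification, the sketch below has a trusted issuer (say, an eID wallet provider) check identity once and then sign a token carrying a single claim: whether the holder is over 16. The platform verifies the signature and reads that one flag; it never sees a name, birth date, or document. This is a deliberately simplified toy with hypothetical names throughout; real schemes such as anonymous credentials or the selective disclosure planned for the EU Digital Identity Wallet add properties this version lacks (unlinkability across services, replay protection, revocation).

```python
import json
import time

# Third-party dependency: pip install cryptography
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Issuer side (e.g. a hypothetical eID wallet provider) -----------------
# The issuer checks identity documents once, then signs a token that carries
# only the single claim a platform needs: an over-16 flag plus an expiry.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()  # published for verifiers

def issue_age_token(over_16: bool, ttl_seconds: int = 3600) -> tuple[bytes, bytes]:
    claims = {"over_16": over_16, "exp": int(time.time()) + ttl_seconds}
    payload = json.dumps(claims, sort_keys=True).encode()
    return payload, issuer_key.sign(payload)

# --- Platform side ----------------------------------------------------------
# The platform verifies the issuer's signature and reads the flag. No name,
# birth date, or identity document ever reaches the platform.
def verify_age_token(payload: bytes, signature: bytes) -> bool:
    try:
        issuer_public_key.verify(signature, payload)
    except InvalidSignature:
        return False
    claims = json.loads(payload)
    return claims["over_16"] and claims["exp"] > int(time.time())

payload, sig = issue_age_token(over_16=True)
print(verify_age_token(payload, sig))          # True
print(verify_age_token(payload + b" ", sig))   # False: tampered payload
```

The privacy gain is that the platform learns one bit plus an expiry time rather than a full identity record, which is the data-minimisation logic the GDPR already pushes toward.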
A Policy Framework for Child Safety and Innovation
As Europe’s digital strategy evolves, policymakers face a series of trade-offs:
- How to protect minors from harmful content without isolating them from beneficial online social and educational experiences.
- How to balance privacy and safety, especially when verification mechanisms require personal data.
- How to impose accountability on global platforms while preserving competition and innovation in digital markets.
The Bruegel paper highlights that purely ad hoc fixes, whether bans or simple age limits, will not suffice. Instead, a regulatory architecture is needed that embeds child safety into the market structure itself, from algorithmic design rules to enforceable compliance mechanisms that can adapt as technology evolves.
What’s Next for Europe—and the World
Europe’s debates are already influencing policymakers globally, from Australia’s age ban to U.S. federal proposals like the Kids Online Safety Act. While legislative paths differ—some focus on liability and duty of care, others on age limits or advertising controls—there is growing consensus that the digital age of childhood cannot be left entirely to market forces.
For governments, the challenge going forward will be designing smarter, evidence-based regulation that protects children while preserving the benefits of digital participation. The conversation is no longer just about banning harmful content; it’s about shaping markets so that platforms compete not just on growth and engagement, but on responsibility and human rights outcomes—a shift that could redefine digital governance in the decade ahead.
Key Takeaways
- Effective digital regulation must go beyond age bans to address platform accountability and design choices.
- Regulators should emphasize risk assessments, transparency, and enforceable sanctions for youth safety failures.
- Age verification strategies must balance child protection with privacy safeguards.
- A broader education and participation strategy is essential for long-term resilience and awareness.
Europe’s emerging framework is poised to become a global reference point—blending legal, technological, and social policy tools to protect the next generation growing up online.