Dublin, Ireland — In a bold move to enforce Europe’s stringent online safety rules, Ireland’s media watchdog, Coimisiún na Meán (CNM), has launched formal probes into social media giants TikTok and LinkedIn over concerns that both deploy confusing “dark patterns” that hinder users from flagging illegal content.
The investigations, announced on December 2, 2025, zero in on potential breaches of the EU’s Digital Services Act (DSA), Europe’s landmark legislation aimed at curbing online harms. At issue are the platforms’ “Notice and Action” mechanisms—tools designed to let users report suspected illegal material swiftly and anonymously. Regulators suspect these features are neither user-friendly nor effective, potentially allowing dangerous content to proliferate unchecked.
John Evans, CNM’s Digital Services Commissioner, didn’t mince words: “The Digital Services Act has marked a step change for online safety in Ireland, and across the EU, providing people who use online providers with greater rights, and placing new obligations on providers to keep people safe online.” He highlighted that core to the DSA is the right to report illegal content through accessible, straightforward interfaces—obligations he alleges TikTok and LinkedIn may be flouting.
The probes stem from a sweeping September 2024 review of compliance among Ireland-based online intermediaries. That audit uncovered red flags: reporting buttons buried in labyrinthine menus, ambiguous language blurring the line between Terms of Service violations and outright illegal acts, and designs that nudge users away from escalating reports. For child sexual abuse material (CSAM)—a DSA priority—anonymous flagging options appear conspicuously absent, a glaring oversight that could deter victims or witnesses from coming forward.
Under DSA Articles 16(1) and 16(2)(c), platforms must ensure reporting tools are easy to find and operate, including anonymous submissions for CSAM. Article 25 adds a layer of scrutiny, banning manipulative interface designs that distort user choices. CNM, acting as Ireland’s Digital Services Coordinator under the Irish Digital Services Act 2024, is now demanding detailed disclosures from both companies, with findings potentially triggering fines of up to 6% of global annual turnover, a penalty that could run into billions for ByteDance-owned TikTok and Microsoft-owned LinkedIn.
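To make the Article 16 obligations concrete, here is a minimal sketch, in TypeScript with hypothetical field and function names, of the information a DSA-style notice is expected to carry, including the carve-out that lets suspected-CSAM reports omit the notifier’s identity. It is an illustrative data shape, not any platform’s actual API.

```typescript
// Illustrative sketch of a DSA Article 16 "notice and action" payload.
// Field names are hypothetical; this is not any platform's real API.

interface IllegalContentNotice {
  // Art. 16(2)(a): a sufficiently substantiated explanation of why
  // the notifier considers the content illegal.
  explanation: string;

  // Art. 16(2)(b): the exact electronic location of the content,
  // such as a URL, so the platform can find it without guesswork.
  contentUrl: string;

  // Art. 16(2)(c): the notifier's name and email, EXCEPT for
  // suspected CSAM, where anonymous submission must remain possible.
  notifier?: { name: string; email: string };

  // Whether the report concerns suspected CSAM, which triggers the
  // anonymity carve-out above.
  suspectedCsam: boolean;

  // Art. 16(2)(d): a statement that the notice is made in good faith.
  goodFaithConfirmed: boolean;
}

// Returns a list of problems; an empty list means the notice is acceptable.
function validateNotice(n: IllegalContentNotice): string[] {
  const problems: string[] = [];
  if (!n.explanation.trim()) problems.push("missing explanation");
  if (!n.contentUrl.trim()) problems.push("missing content location");
  if (!n.goodFaithConfirmed) problems.push("good-faith statement required");
  // Identity is mandatory only when the report is NOT about suspected CSAM.
  if (!n.suspectedCsam && !n.notifier) {
    problems.push("name and email required for non-CSAM notices");
  }
  return problems;
}
```

The final check is where CNM’s concern bites: identity becomes optional exactly when a report concerns suspected CSAM. As for the penalty ceiling, the arithmetic is straightforward: a hypothetical platform with €50 billion in global annual turnover would face a maximum fine of €3 billion.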
Dark Patterns in the Spotlight: A Growing EU Crackdown
The term “dark patterns” has become a regulatory buzzword, referring to manipulative UI tricks that exploit cognitive biases to favor platforms over users. On TikTok, for instance, reports suggest the app’s viral video feeds make it a hunting ground for predators, yet its reporting flow allegedly funnels complaints into generic “community guidelines” buckets rather than dedicated illegal-content lanes. LinkedIn, pitched as a professional network, faces parallel scrutiny: reports of professional misconduct might mask deeper issues like harassment or hate speech, behind interfaces that overwhelm users with legalese. The sketch below shows the structural difference at stake.
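For illustration only, the following deliberately simplified TypeScript contrasts a report menu that drains every complaint into a generic community-guidelines queue with one that surfaces an illegal-content route as a first-class option. The menu labels and types are hypothetical and depict neither platform’s actual interface.

```typescript
// Hypothetical report menus, simplified to expose the structural difference.
// Neither reflects TikTok's or LinkedIn's real interface.

type ReportOption = { label: string; route: "guidelines" | "illegal" };

// Dark-pattern-style menu: every path drains into the generic
// community-guidelines queue; there is no illegal-content lane.
const buriedMenu: ReportOption[] = [
  { label: "I don't like this", route: "guidelines" },
  { label: "Spam or misleading", route: "guidelines" },
  { label: "Violates community guidelines", route: "guidelines" },
];

// Compliant-style menu: suspected illegal content gets its own clearly
// labelled route, visible at the top of the menu.
const compliantMenu: ReportOption[] = [
  { label: "Report illegal content", route: "illegal" },
  { label: "Violates community guidelines", route: "guidelines" },
];

// The question regulators are asking, in one line: does any visible
// option actually reach the illegal-content lane?
const hasIllegalLane = (menu: ReportOption[]): boolean =>
  menu.some((option) => option.route === "illegal");

console.log(hasIllegalLane(buriedMenu)); // false: reports get funnelled away
console.log(hasIllegalLane(compliantMenu)); // true
```

The structural point is the substance here: when every visible option routes to the guidelines queue, the Article 16 reporting right is effectively unreachable, which is precisely the design concern Article 25 targets.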
This isn’t isolated. Since the DSA became fully applicable in February 2024, digital services coordinators like CNM have ramped up oversight, especially in Ireland, whose tech hub hosts 11 of the world’s 15 largest platforms. TikTok, already under fire for youth safety lapses, was fined €345 million by Ireland’s Data Protection Commission in 2023 over its handling of children’s data. LinkedIn, meanwhile, has weathered GDPR scrutiny but now confronts the DSA’s behavioral guardrails.
Evans noted progress elsewhere: several unnamed providers have overhauled their reporting systems after engaging with the regulator, and those changes are now under CNM’s microscope for real-world efficacy. “Our message is clear,” he said. “We expect providers to comply with their obligations under the DSA and to engage with us when making changes to their reporting mechanisms.”
Broader context underscores the urgency. A 2025 Eurobarometer survey found that 62% of EU users had encountered suspected illegal content online, yet only 28% knew how to report it effectively. CSAM reports to hotlines such as Ireland’s Hotline.ie surged 25% year-over-year, straining under-resourced systems. The DSA’s teeth, coordinated via the European Commission’s oversight of “very large online platforms” (VLOPs), aim to flip this script, but enforcement lags behind ambition.
High Stakes: Fines, Fixes, and User Empowerment
If substantiated, violations could culminate in administrative penalties or “Commitment Agreements,” in which platforms pledge binding reforms under penalty of escalation. CNM isn’t stopping at TikTok and LinkedIn; further information requests to other Irish-based entities signal a widening net. Collaboration with the Commission ensures VLOP alignment, while fellow national coordinators such as France’s ARCOM and Germany’s Bundesnetzagentur amplify cross-border pressure.
For users, CNM offers lifelines. Direct reporting to platforms remains the first step, but for anyone who hits a stumbling block, the agency’s Contact Centre stands ready, with escalation to its Complaints Team for unresolved cases. Evans urged: “For people in Ireland and across the EU who use online providers that are based in Ireland, we would encourage you to report suspected illegal content to the provider where you see it. If you can’t find an easy way to do this, or if you’re not happy with a platform’s response, our Contact Centre can provide advice and support.”
Industry voices are muted so far; TikTok and LinkedIn have yet to comment publicly. But precedents loom large: Meta’s €1.2 billion GDPR fine in 2023 and X’s ongoing DSA skirmishes offer a cautionary backdrop. As Ireland, home to Dublin’s “Silicon Docks,” cements its role as Europe’s front-line enforcer, this dual probe signals a no-holds-barred era: platforms must illuminate their dark corners or face the DSA’s unblinking glare.
In a digital landscape where algorithms amplify harms faster than humans can react, CNM’s actions reaffirm the EU’s commitment to user-centric safety. As investigations unfold—potentially into 2026—the onus is on TikTok and LinkedIn to prove their mechanisms aren’t just performative, but protective. For now, the message to Big Tech is unequivocal: Transparency isn’t optional; it’s obligatory.
Key DSA Articles Under Scrutiny
Article 16(1): reporting tools must be easy to find and operate.
Article 16(2)(c): suspected CSAM must be reportable anonymously.
Article 25: interfaces must not manipulate or distort users’ choices.
[Chart omitted: bar lengths represented regulatory emphasis based on CNM’s stated concerns.]