EU’s Chat Control Proposal

The EU’s Chat Control proposal, a draft regulation mandating the detection of child sexual abuse material (CSAM), aims to curb online child abuse through compulsory scanning of private communications. Introduced in May 2022 and dubbed “Chat Control 2.0,” it goes beyond the voluntary scanning already performed by services like Gmail and Facebook Messenger: providers would be required to scan chats, messages, emails, and cloud storage, effectively breaking end-to-end encryption on platforms like WhatsApp and Signal. The stated goal is to raise CSAM detection reports from 1.5 million in 2021 to tens of millions, aiding prosecutions. The proposal also includes network-level blocking, mandatory age verification that would end online anonymity, app store censorship, and the exclusion of minors from certain platforms.

Timeline and Political Push

The proposal has stalled repeatedly but persists into 2025. After failing under the Belgian and Hungarian Council presidencies in 2024, it was revived by Denmark on July 1, 2025, with adoption targeted for October 14, 2025, and Council meetings on July 11 and October 14. Belgium’s 2024 consent-based scanning model was criticized as coercing consent, and Poland’s 2025 voluntary-scanning compromise likewise failed to win agreement. A “blocking minority” of member states stalled progress in December 2024, but shifts such as France withdrawing its support and Germany’s continued indecision could sway the outcome. The EU’s ProtectEU strategy, launched in June 2025, seeks decryption powers by 2030, signaling broader challenges to encryption ahead.

How Chat Control Works

Chat Control relies on client-side scanning: algorithms running on the user’s own device analyze content before it is encrypted (sketched in the code below). The obligation applies to all providers of paid services and uses AI to detect both known and previously unseen CSAM, with text analysis to flag grooming. Suspicious content is reported to authorities automatically. False-positive rates are high, reportedly up to 80% in countries like Switzerland, creating a real risk of wrongful accusations. Mandatory age verification via biometrics or ID checks could end online anonymity, while scanning of cloud storage (e.g., iCloud) extends surveillance to personal files.
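To make the mechanism concrete, here is a minimal Python sketch of hash-based client-side scanning. Every name and value in it is hypothetical: real deployments (Microsoft’s PhotoDNA, or Apple’s shelved NeuralHash) use perceptual hashes that survive resizing and re-encoding, and that fuzziness is exactly where false positives come from; a cryptographic hash is used here only to keep the sketch short.

```python
import hashlib

# Hypothetical database of digests of known illegal material.
# Real systems distribute perceptual-hash databases, not SHA-256 digests.
KNOWN_CONTENT_HASHES: set[str] = {
    hashlib.sha256(b"example-known-content").hexdigest(),
}

def scan_before_encryption(attachment: bytes) -> bool:
    """Runs on the sender's device, against plaintext, before any E2E encryption."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_CONTENT_HASHES

def send_message(attachment: bytes) -> None:
    if scan_before_encryption(attachment):
        # Under the proposal, a match is reported to authorities:
        # the content has been judged before it was ever encrypted.
        print("match: content flagged and reported")
        return
    ciphertext = attachment[::-1]  # stand-in for a real E2EE step (e.g., the Signal protocol)
    print(f"no match: delivering {len(ciphertext)} encrypted bytes")

send_message(b"example-known-content")   # flagged and reported
send_message(b"ordinary holiday photo")  # encrypted and delivered
```

The point the sketch makes is architectural: the match decision happens on the device against plaintext, before encryption, so the “end-to-end” guarantee no longer holds regardless of how strong the cipher is.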

Privacy Implications

Chat Control undermines end-to-end encryption, opening vulnerabilities that hackers and hostile governments can exploit and endangering journalists, whistleblowers, and abuse victims. It invites self-censorship, chills free expression, and may criminalize consensual sexting between minors or exclude them from platforms altogether. High error rates mean innocent content, such as family photos, gets flagged, and the scale of that problem becomes clear with simple arithmetic (see the calculation below). Critics compare the scheme to opening every piece of postal mail, warning of a surveillance apparatus that could expand well beyond CSAM.
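A back-of-the-envelope calculation, using assumed numbers rather than figures from the proposal, shows why false positives dominate at messaging scale: when the targeted content is rare, even an accurate classifier flags mostly innocent material.

```python
def flagged_breakdown(messages: int, prevalence: float,
                      detection_rate: float,
                      false_positive_rate: float) -> tuple[int, int]:
    """Returns (true hits, innocent items wrongly flagged)."""
    illegal = messages * prevalence
    innocent = messages - illegal
    return round(illegal * detection_rate), round(innocent * false_positive_rate)

# Hypothetical inputs: one billion messages a day, 1 in 100,000 illegal,
# a 99% detection rate, and a 1% false-positive rate.
hits, false_alarms = flagged_breakdown(1_000_000_000, 1e-5, 0.99, 0.01)
print(f"{hits:,} true hits vs {false_alarms:,} false alarms")
# -> 9,900 true hits vs 9,999,900 false alarms
```

With these illustrative inputs, more than 99.9% of flagged messages are innocent, and every one of them is plaintext that a human reviewer somewhere would have to read.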

GDPR and Data Subject Rights Violations

Chat Control conflicts with core GDPR principles, including data minimization and lawful processing. Blanket scanning processes sensitive data without any individualized suspicion, clashing with the GDPR’s necessity and proportionality requirements; the European Court of Justice has already ruled that general and indiscriminate retention of communications data violates fundamental rights (La Quadrature du Net, Case C-511/18). The scheme also breaches the GDPR’s fairness and transparency principles, since users cannot consent meaningfully under coercive “voluntary” models. It threatens data subject rights to erasure, objection, and transparency, and automated flagging without meaningful human review would run afoul of Article 22’s limits on solely automated decision-making. Indefinite retention in CSAM databases could make erasure rights impossible to exercise. These breaches expose providers to penalties of up to 4% of global turnover and infringe the EU Charter rights to privacy, data protection, and free expression.

Opposition and Future Outlook

Over 80% of respondents to the 2022 public consultation opposed scanning encrypted services. In 2023 the European Parliament backed a narrower approach of targeted, judicially supervised scans. Advocacy groups such as the EFF oppose the regulation, and Signal has threatened to withdraw from the EU if it passes. Public campaigns and lawsuits continue, and Denmark’s renewed push still faces a blocking minority. Revisions, such as limiting scans to known content or exempting end-to-end encryption, may emerge before the October 2025 vote. The EU must balance child protection with privacy, ideally through stronger targeted law enforcement tools rather than mass surveillance, to avoid setting a global precedent for privacy erosion.

Online Privacy Compliance Made Easy

Captain Compliance makes it easy to develop, oversee, and expand your privacy program. Book a demo or start a trial now.