Short answer: it’s a coalition. The push doesn’t come from a single puppet-master but from overlapping interests inside EU institutions, some national governments and police bodies, plus a small but loud set of NGOs and companies that sell scanning/age-verification tech.
Here’s who (and why):
European Commission (DG HOME) & the current Council majority. The original CSAM/“Chat Control” draft was tabled by Commissioner Ylva Johansson in May 2022, and the Council has kept trying to revive stricter versions (e.g., “upload moderation” / client-side scanning) ahead of an expected Council vote in Oct 2025. Motive: expand legal powers to order detection/scanning across platforms, including E2EE services.
Certain national interior ministries and law-enforcement bodies (incl. Europol). They argue they’re “going dark” and want easier “lawful access” (interception, scanning, data retention). Europol papers and “lawful access” roadmaps from the Commission explicitly seek ways around modern security/encryption barriers. Motive: investigative convenience and data visibility.
Child-protection NGOs and allied lobby networks (notably Thorn and the IWF). These groups have campaigned actively for mandatory scanning and praised the proposal; investigative reporting has documented their influence in Brussels. Motive: mission-driven framing (combating CSAM), plus, in Thorn's case, alignment with its own scanning products.
Age-assurance / identity vendors (esp. in the UK, but increasingly EU-facing). The UK Online Safety Act has created a market for “highly effective” age checks; firms publicly position themselves as ready suppliers. Motive: regulation-driven demand for ID/face-scan/verification services.
Some platforms and compliance consultancies. A few see pre-encryption screening and age-gating as a way to reduce liability under UK/EU regimes, and industry explainers are already anticipating "upload moderation." Motive: risk management and regulatory bargaining.
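To make the "upload moderation" / client-side scanning idea above concrete, here's a minimal sketch of the architecture being proposed: the client checks the plaintext against a blocklist before encrypting. Everything here is illustrative (real proposals envisage perceptual hashes like PhotoDNA that match near-duplicate images, not SHA-256 of bytes, and the XOR "encryption" is a stand-in for real E2EE):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests. Real systems use perceptual
# hashes (e.g. PhotoDNA-style), which also match slightly altered images.
BLOCKLIST = {hashlib.sha256(b"known-illegal-sample").hexdigest()}

def client_side_scan(message: bytes) -> bool:
    """Check the *plaintext* against the blocklist -- before any encryption."""
    return hashlib.sha256(message).hexdigest() in BLOCKLIST

def send(message: bytes) -> bytes:
    if client_side_scan(message):
        # In the proposed architecture a match triggers a report to a
        # detection centre; the message never reaches the E2EE layer.
        raise RuntimeError("flagged and reported before encryption")
    # Toy stand-in for real end-to-end encryption (e.g. the Signal protocol).
    return bytes(b ^ 0x5A for b in message)

print(send(b"hello"))  # "encrypted" and sent normally
```

The structural point critics make is visible in the code: because the scan runs on the client over plaintext, the guarantee E2EE normally provides ("only the endpoints see the content") is gone regardless of how strong the cipher is, and the blocklist can later be extended to anything.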
Who’s against it (and why your “child protection” objection has teeth):
EU data-protection authorities (EDPB/EDPS). Their joint opinion warns the draft would lead to generalised, indiscriminate scanning and must not weaken encryption.
European Parliament’s civil-liberties committee & many cryptographers. LIBE pushed back on indiscriminate scanning/E2EE weakening; over 500 crypto researchers signed an open letter calling the scheme unsafe and ineffective. Motive: fundamental rights, security, and high false-positive rates.
Digital-rights groups (EDRi, etc.). They call the latest Council variant “mass surveillance” likely to expand (function creep) beyond CSAM, exactly as you fear.
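The false-positive objection the researchers raise is basic base-rate arithmetic: when the content you're hunting for is rare, even a very accurate detector produces mostly false alarms. A quick sketch (all numbers are illustrative assumptions, not figures from any study):

```python
# Base-rate arithmetic behind the false-positive objection.
# Every number below is an assumption chosen for illustration.

messages_per_day = 10_000_000_000  # assumed EU-wide daily message volume
prevalence = 1e-6                  # assumed fraction of messages that are CSAM
true_positive_rate = 0.99          # assumed detector sensitivity
false_positive_rate = 0.001        # assumed 99.9% specificity (generous)

actual_bad = messages_per_day * prevalence
flagged_bad = actual_bad * true_positive_rate
flagged_innocent = (messages_per_day - actual_bad) * false_positive_rate

# Precision: of all flagged messages, what share is actually CSAM?
precision = flagged_bad / (flagged_bad + flagged_innocent)
print(f"{flagged_innocent:,.0f} innocent messages flagged per day")
print(f"precision: {precision:.1%}")
```

Under these assumptions roughly ten million innocent messages get flagged daily and precision lands around 0.1%, i.e. about 999 in 1,000 flags are false alarms that someone (or some authority) then has to inspect. The ratio only gets worse as specificity drops or prevalence falls.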
About the UK piece you mention: the Online Safety Act 2023 is being enforced in phases in 2025, with mandatory age-assurance rolling out and guidance that could pressure messaging services. This isn’t literally “ID to chat,” but it does create real incentives for identity checks and content scanning.
Bottom line
The durable drivers are law-enforcement access and a regulatory market for scanning/age-verification tech. The "protect children" narrative is the sales pitch that makes these measures politically palatable. But the technical community and EU watchdogs keep concluding that the proposals would break private communications for everyone while being easy to repurpose for broader surveillance.
If you want to push back effectively (Germany is pivotal right now): contact your MEPs and Germany’s Council representatives before the October Council window; groups like EDRi maintain up-to-date briefings and document pools with specific talking points.