Will the EU’s Chat Control legislation undermine encryption and privacy?


The European Union (EU) is moving closer to passing new legislation aimed at combating child sexual abuse material (CSAM). Known as “Chat Control 2.0,” this proposed law would require the bulk scanning of digital messages, including those on encrypted platforms like WhatsApp and Signal.

Proponents argue that these measures are essential for protecting children, but the proposal has sparked significant controversy. Privacy advocates, tech companies, and even some members of the European Parliament are raising alarms. They believe the legislation could undermine end-to-end encryption and lead to mass surveillance.

Critics warn that, despite good intentions, the legislation could introduce vulnerabilities in encrypted messaging systems, making them susceptible to hacking and misuse, ultimately compromising the privacy and security of millions of users. Although the EU Council was set to vote on the proposal on June 20, 2024, the decision was postponed due to significant opposition. The proposal is not off the table and could still move forward after further negotiations. The debate over Chat Control 2.0 continues to highlight the ongoing struggle to balance child safety with the fundamental right to privacy.

Below, we’ll explore the implications of the proposed Chat Control legislation, the arguments from both supporters and opponents, and what it means for the future of privacy and security in Europe.

Jump to…
What is the Chat Control legislation?
How does Chat Control work?
Which platforms are impacted?
Why is the legislation controversial?
Who supports the Chat Control legislation?
Opposition to Chat Control
Legal and political landscape
So, what happens next?
Finding a middle ground

What is the Chat Control legislation?

Chat Control 2.0 represents the EU’s latest effort to combat the spread of CSAM online. Introduced in 2022 by European Commissioner for Home Affairs Ylva Johansson, the legislation aims to establish a comprehensive framework for detecting and preventing the distribution of CSAM across digital communication platforms.

At its core, the Chat Control legislation seeks to implement an “upload moderation” system, which would require digital messaging services to scan all shared content, including photos, videos, and links, against a government database of known abuse material. This scanning would occur even on encrypted messaging platforms such as WhatsApp, raising significant concerns about the potential impact on end-to-end encryption.

The key objectives behind the Chat Control legislation are to enhance the detection of CSAM, facilitate quicker response times from law enforcement, and ultimately protect vulnerable children from exploitation. EU officials argue that with the rise of encrypted messaging and AI-powered image generation software, there is an urgent need for stronger measures to prevent these platforms from becoming safe havens for illegal content.

How does Chat Control work?

Upload moderation

The proposed Chat Control legislation would require digital messaging services to set up an upload moderation system. All shared content—photos, videos, links—would be scanned before being sent to the recipient. These scans would compare the content against a government database of known CSAM, using AI-powered algorithms to spot potential matches. If something suspicious pops up, it gets flagged for further review by human moderators and, if needed, reported to law enforcement. During this review process, the message would be held back and not delivered to the intended recipient until it has been cleared as safe.
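To make the flag-and-hold logic concrete, here is a deliberately simplified sketch of database matching. Real deployments use perceptual hashing (robust to resizing and re-encoding) plus machine-learning classifiers, not the exact-match SHA-256 digest used here; the hash set and function names are purely illustrative.

```python
import hashlib

# Hypothetical database of digests of known illegal material (illustrative
# only; this value is just the SHA-256 of the bytes b"abc").
KNOWN_HASHES = {
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def scan_upload(content: bytes) -> str:
    """Return 'hold' if the content matches a known digest, else 'deliver'.

    A matched message would be queued for human review instead of being
    delivered; everything else passes through immediately.
    """
    digest = hashlib.sha256(content).hexdigest()
    if digest in KNOWN_HASHES:
        return "hold"       # withheld pending human review
    return "deliver"        # forwarded to the recipient
```

The key design point the article describes is that delivery is conditional: a match does not go to the recipient until a human reviewer clears it.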

Scanning encrypted messages

One of the biggest points of contention is the requirement to scan messages on end-to-end encrypted platforms. End-to-end encryption ensures that only the sender and recipient can read the messages, keeping them safe from prying eyes. However, the proposed law suggests that messages should be scanned before they’re encrypted and sent. The content would be checked against the CSAM database at the point of upload, potentially undermining the privacy end-to-end encryption provides.
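The ordering described above is the crux of the encryption dispute, and it can be sketched in a few lines. Everything here is a toy: the `scan` rule is a placeholder, and XOR is standing in for a real end-to-end encryption protocol. The point is only that the scanner sees plaintext before the encryption layer ever touches it.

```python
def scan(plaintext: bytes) -> bool:
    """Stand-in for a client-side scanner; returns True if flagged."""
    return b"FLAGGED" in plaintext  # toy rule, purely illustrative

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy XOR cipher standing in for a real E2EE protocol."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes):
    # The contested design: content is inspected *before* encryption,
    # so the scanner handles the plaintext that E2EE was meant to protect.
    if scan(plaintext):
        return None                    # withheld pending review
    return encrypt(plaintext, key)     # only cleared content is encrypted and sent
```

Critics' objection, in these terms: whatever component runs `scan` becomes a party to every conversation, which is exactly what end-to-end encryption is designed to rule out.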

User consent

To address privacy concerns, the legislation includes a provision for user consent. Users would need to agree to have their messages scanned. If they don’t consent, they won’t be able to upload or share photos, videos, and links on the platform. This opt-in requirement aims to give users some control over their data, but it also raises questions about how effective and practical the system would be, and it could push some users toward services that don’t implement this kind of scanning.

Which platforms are impacted?

Chat Control 2.0 is aimed at a wide range of encrypted digital communication platforms, including popular messaging services like WhatsApp, Signal, and Telegram. These services have gained a strong following because of their robust encryption that keeps messages private between the sender and recipient.

Social media platforms like Facebook Messenger and Instagram, which have direct messaging features, are also in the spotlight. Even though these platforms aren’t solely designed for secure messaging, they still raise significant concerns due to their massive user bases and the private nature of their messaging.

Email services like Proton Mail and other lesser-known encrypted communication tools aren’t off the hook either. Any platform that enables the exchange of digital messages and allows multimedia sharing would need to adopt the “upload moderation” system, scanning for known CSAM before the content gets encrypted and sent.

Why is the legislation controversial?

Clash with end-to-end encryption

The Chat Control legislation has stirred up quite a storm, mainly because it clashes with the principles of end-to-end encryption and the fundamental right to privacy. End-to-end encryption is a cornerstone of digital security, ensuring only the sender and recipient can read a message. By proposing to scan messages before they’re encrypted, the legislation introduces what many critics call a “backdoor” into secure communication systems.

Privacy advocates argue that this move undermines the very essence of encryption, leaving messages open to potential breaches and unauthorized access. They warn that any access point for scanning could be exploited by malicious actors, including hackers and hostile governments, compromising the security of millions of users.

Mass surveillance concerns

The legislation has also been likened to mass surveillance, a comparison that raises alarms among those worried about civil liberties. Critics say scanning every message for illicit content amounts to indiscriminate surveillance, infringing on the privacy of all users regardless of any suspicion of wrongdoing. This brings up serious ethical and legal questions about balancing security measures with individual rights.

Potential misuse of technology 

Another big issue is the potential misuse of the scanning technology. While the main goal is to detect and prevent the spread of CSAM, there’s a fear that once established, the same technology could be used for broader surveillance. This could lead to a slippery slope where the scope of the legislation expands to monitor other types of content, such as political speech, dissent, or other legally protected activities, further eroding privacy protections.

AI inaccuracies and false positives 

Tech companies and digital rights groups also highlight the practical challenges and flaws of the proposed measures. The reliance on AI for content scanning raises significant concerns. AI algorithms, though powerful, aren’t perfect. They can make errors, resulting in false positives—innocent content mistakenly flagged as abusive. These inaccuracies can have severe consequences for users, such as wrongful scrutiny, legal issues, and personal distress.

And this problem isn’t just theoretical. Real-world examples show the risks. In 2022, Google’s AI tool wrongly flagged a photo of a child’s medical condition as abusive, leading to a police investigation and the wrongful termination of the father’s Google accounts. Incidents like this highlight how AI mistakes, when scaled to a legislative mandate, could impact thousands or even millions of users.

Power and accountability concerns 

Furthermore, the legislation places a lot of power in the hands of private companies tasked with implementing the scanning systems. This delegation of responsibility to corporations, many of which are based outside the EU, raises questions about accountability and oversight. Critics worry this could lead to inconsistent enforcement and potential abuses of power by the companies involved.

Who supports the Chat Control legislation?

Despite the controversies, there are strong voices in favor of the Chat Control legislation. Supporters, including EU officials, law enforcement agencies, and child protection advocates, argue that these measures are essential for stopping the spread of CSAM and protecting vulnerable children across Europe.

EU officials

Leading the charge is Johansson, who first introduced the proposals in 2022. The Swedish politician and other EU officials argue these new measures are essential for tackling the growing threat of online child exploitation. They point out that while encrypted platforms are great for privacy, they’ve also become safe havens for those committing serious crimes. By implementing the “upload moderation” system, they aim to catch and stop CSAM before it spreads.

Law enforcement agencies

Law enforcement agencies across Europe are also big supporters of Chat Control 2.0. They highlight the difficulties faced in tracking and prosecuting online child abuse without access to necessary digital evidence. These agencies argue current encryption technologies make it hard to effectively investigate and prevent CSAM-related crimes. They believe the proposed legislation would provide the tools needed to combat child exploitation more efficiently and protect at-risk children.

Child protection advocates

Advocates for child protection measures, including various NGOs and advocacy groups, also back the legislation. They reference alarming statistics from organizations like the UK-based Internet Watch Foundation (IWF), which reported that 66% of the 32 million cases of child exploitation material in 2022 came from the EU. These advocates argue the legislation is a necessary compromise to address a pressing issue, asserting that the benefits of safeguarding children far outweigh the privacy concerns raised by opponents.

Government support

Belgium, currently heading the Council of the EU, played a key role in advancing the latest version of the legislation. Officials argue the proposals strike a balance between privacy and security, limiting scans to photos, videos, and URLs while requiring user consent. They believe this is a reasonable approach to ensure encrypted platforms don’t become breeding grounds for CSAM.

Opposition to Chat Control

Chat Control 2.0 has sparked a strong backlash. Key players in digital rights, tech, and privacy circles are speaking out against it, arguing that it jeopardizes essential freedoms and security measures.

Privacy advocates and digital rights organizations

Leading digital rights groups, like the Electronic Frontier Foundation (EFF), European Digital Rights (EDRi), and the Internet Freedom Foundation, have been vocal about their opposition. They argue the legislation introduces a form of mass surveillance, comparing it to Orwellian oversight that infringes on civil liberties. These organizations are concerned that scanning all digital messages, regardless of any suspicion of illegal activity, sets a dangerous precedent that could erode privacy rights across the EU.

Tech companies

Major tech companies, especially those offering encrypted messaging services, are also strongly against the legislation. Signal, WhatsApp, and Proton Mail have all raised serious concerns about the impact on end-to-end encryption. Signal’s president, Meredith Whittaker, has been particularly outspoken, stating the legislation’s requirement for message scanning, even before encryption, fundamentally compromises the security end-to-end encryption is supposed to provide.

Whittaker has called the legislation “the same old surveillance with new branding,” highlighting that terms like “upload moderation” are just euphemisms for creating backdoors in encrypted systems. She warns that these backdoors, no matter how they’re framed, would introduce vulnerabilities that could be exploited by hackers and hostile nation-states, effectively dismantling the protections offered by end-to-end encryption. Signal has even threatened to cease operations in the EU if the law passes, underscoring the seriousness of the issue.

European Parliament members

Several members of the European Parliament (MEPs) have also voiced their concerns. Patrick Breyer, a member of Germany’s Pirate Party, is a leading critic of the legislation. He argues the proposed measures are equivalent to installing government spyware on every device, which he believes is an extreme form of surveillance not seen in any other democratic nation. Breyer and other MEPs insist that such sweeping surveillance measures should not be implemented without robust legal safeguards and targeted surveillance based on probable cause.

Public and expert opinions

The general public and various cybersecurity and digital rights experts have also expressed opposition. A poll conducted by the European Digital Rights (EDRi) group showed a significant majority of young people in the EU are against the legislation, favoring privacy over the proposed security measures. Experts in cryptography, like Matthew Green from Johns Hopkins University, have warned the proposed scanning systems could introduce significant vulnerabilities, making encrypted messaging systems more susceptible to attacks.

Legal and political landscape

How close is Chat Control 2.0 to being signed into EU law? Since it was introduced in 2022, the legislation has been a lightning rod for debate and controversy. Fast forward to mid-2024, and the proposal is still stuck in a quagmire of resistance and delays. When it seemed like a vote might finally happen on June 20, the EU Council hit pause. The Belgian Presidency admitted that member states needed “more negotiations” because of the strong opposition to the legislation.

Where do member states stand?

The member states are pretty divided:

  • Against: Germany, Luxembourg, the Netherlands, Austria, and Poland have firmly said no. They’re worried about privacy and the risk of mass surveillance. Recently, Sweden has also been adding its voice to the opposition.
  • For: Belgium has been pushing hard for the legislation, arguing that it strikes a good balance between protecting children and preserving privacy.

Despite the vocal opposition, it hasn’t been enough to completely shut down the legislation. For the proposal to move forward, a qualified majority is needed: at least 55% of member states, representing at least 65% of the EU population, have to vote yes. This complex voting system makes the whole situation even more uncertain.
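The qualified-majority rule is simple arithmetic, and a short function makes the double threshold easier to see. The population figures below are made up for illustration; the 55%/65% thresholds come from the Council’s voting rules described above.

```python
def qualified_majority(votes_for: list, populations: dict) -> bool:
    """Council qualified majority: at least 55% of member states voting yes,
    AND those states representing at least 65% of the total EU population."""
    total_states = len(populations)
    total_pop = sum(populations.values())
    yes_pop = sum(populations[s] for s in votes_for)
    states_ok = len(votes_for) >= 0.55 * total_states
    pop_ok = yes_pop >= 0.65 * total_pop
    return states_ok and pop_ok

# Toy example with four hypothetical states (populations in millions):
toy = {"A": 50, "B": 30, "C": 15, "D": 5}
```

Note that both conditions must hold at once: two large states can clear the population bar while still failing the state-count bar, which is part of why the Council outcome is hard to predict.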

Legal hurdles ahead

But even if Chat Control 2.0 gets past the voting stage, that doesn’t mean it will become EU law. Privacy advocates and digital rights groups are ready to challenge it in court, and the European Court of Justice, which has a history of pushing back against mass surveillance, will likely have a big say.

So, what happens next?

Further negotiations

The Belgian Presidency has said they will keep talking to find a way forward. They’ve indicated that consultations will continue in a “serene atmosphere” to find common ground among member states. They’re hoping to get everyone on the same page and start negotiations with the European Parliament.

Potential amendments

Given all the pushback, it’s likely that the proposal will be further amended before it’s reconsidered. Potential changes might include stronger protections for end-to-end encryption, clearer definitions of what content will be scanned, and better ways for users to consent.

Timeline for enactment

As for when all this might finally be resolved, it’s anyone’s guess. If a revised vote can happen and the legislation gets through, the next step is the trilogue phase. This is where the European Parliament, the Council, and the Commission work together to hammer out the final details. This could take several months, so we might be looking at late 2024 or even early 2025 before anything is set in stone.

Continued opposition

Even if it does pass, expect ongoing battles. Digital rights groups, privacy advocates, and tech companies aren’t going to give up without a fight. Legal challenges could slow things down even more or force further changes to the legislation.

Finding a middle ground

The Chat Control 2.0 debate highlights a tough issue: how do we protect kids online without giving up digital privacy and security? Finding this balance means thinking about ethics, technology, and social impacts.

Ethically, there’s no doubt we need to protect children from abuse and exploitation. Supporters say strict measures are necessary to keep digital platforms from being used for these crimes. But, we also need to respect personal privacy. Scanning messages can feel like a big invasion of this fundamental right.

From a tech perspective, merging content scanning with end-to-end encryption is tricky. End-to-end encryption is designed to keep messages secure between sender and recipient. Scanning messages, even before encryption, can create security holes that bad actors might exploit.

Socially, the implications are huge. Setting up systems to scan private messages could lead to more extensive surveillance and censorship. Once the technology is in place, it could be used for things beyond its original purpose, like monitoring political dissent or other unwanted speech. This could seriously harm civil liberties.

To find a balance, we need smart solutions to protect kids and general privacy. This could mean developing advanced AI that can spot harmful content without invading privacy or using targeted surveillance with strict oversight.

Most importantly, we need open conversations among everyone involved—privacy advocates, tech companies, child protection groups, legal experts, and policymakers. These discussions should be transparent and based on solid evidence to find ways to keep children safe while also protecting our digital freedoms.

Do you think Chat Control 2.0 is a necessary step to protect children, or does it pose too great a risk to our privacy? Let us know in the comments below. 
