A controversial proposal advanced by the European Union to scan users’ private messages for child sexual abuse material (CSAM) poses serious risks to end-to-end encryption (E2EE), warned Meredith Whittaker, president of the Signal Foundation, which maintains the privacy-focused messaging service of the same name.
The Signal Foundation president's warning about the threat to privacy
"Mandating mass scanning of private communications fundamentally undermines encryption. Full stop," the president of the Signal Foundation declared in a statement on Monday, adding: "Whether this happens via tampering with, for instance, the random number generation of an encryption algorithm, or by implementing a key escrow system, or by forcing communications to pass through a surveillance system before they're encrypted."
The statement comes as lawmakers in Europe propose regulations to combat CSAM with a new provision called "upload moderation," which would allow messages to be scrutinized before encryption.
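To make the objection concrete, here is a minimal, purely conceptual sketch (not any real product's implementation; the hash blocklist and helper names are invented for illustration) of why scanning before encryption undermines the end-to-end guarantee: the scanner necessarily sees the message in plaintext, so the content is no longer private between sender and recipient.

```python
import hashlib
from typing import Callable, Optional

# Hypothetical database of flagged content hashes (illustration only).
BLOCKLIST = {hashlib.sha256(b"example-flagged-bytes").hexdigest()}

def scan_before_encrypt(plaintext: bytes) -> bool:
    """Return True if the plaintext matches a flagged hash."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send(plaintext: bytes, encrypt: Callable[[bytes], bytes]) -> Optional[bytes]:
    # The scan runs on the cleartext BEFORE encryption -- this is the
    # step critics say breaks the "end-to-end" property, because a
    # third party inspects the message the encryption was meant to hide.
    if scan_before_encrypt(plaintext):
        return None  # message blocked/reported instead of being sent
    return encrypt(plaintext)

# Toy stand-in for real encryption, for demonstration only.
ciphertext = send(b"hello", lambda m: m[::-1])
```

Whatever the matching technique (hashes, classifiers, "AI detection"), the structural point is the same: the inspection hook sits on the plaintext side of the encryption boundary.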
A recent report from Euractiv revealed that audio communications are excluded from the scope of the law and that users must consent to the scanning under the service provider's terms and conditions.
"Users who do not consent can still use parts of the service that do not involve sending visual content and URLs," the report added.
Europol's appeal to the technology industry
At the end of April 2024, Europol urged the tech industry and governments to prioritize public safety, warning that security measures such as E2EE could prevent law enforcement from accessing problematic content, reigniting an ongoing debate about balancing privacy with fighting serious crime.
![](https://tech.icrewplay.com/wp-content/uploads/2024/06/Hero@2x-90bf331af1f9fe6fffc024b31061115f19907f0f21f98a3c030f53589a4c08c3-1024x702.jpg)
The agency also asked platforms to design security systems in such a way that they can still identify malicious and illegal activity and report it to law enforcement, though it did not go into implementation details.
A similar plan was previously abandoned
iPhone maker Apple famously announced plans to implement client-side screening for child sexual abuse material (CSAM), but abandoned the idea in late 2022 after sustained backlash from privacy and security advocates.
"Scanning for one type of content, for instance, opens the door to mass surveillance and could create a desire to search other encrypted messaging systems across content types," the company said at the time, explaining its decision. It also described the mechanism as a "slippery slope of unintended consequences."
The Signal Foundation president's conclusion
Whittaker, president of the Signal Foundation, also said that calling the approach "upload moderation" is wordplay that amounts to inserting a backdoor (or a front door), effectively creating a security vulnerability ripe for exploitation by malicious actors and foreign cybercriminals.
![](https://tech.icrewplay.com/wp-content/uploads/2024/06/GettyImages-1242977635-1024x768.webp)
"Either end-to-end encryption protects everyone and establishes security and privacy, or it is broken for everyone," she said, not holding back: "And breaking end-to-end encryption, particularly at a geopolitically volatile time, is a disastrous proposition."
Not just the Signal Foundation: others are unconvinced too
The encrypted messaging service Threema also strongly opposed the so-called Chat Control law, stating that its passage could seriously compromise the privacy and confidentiality of EU citizens and members of civil society.
"No matter how the European Commission is trying to sell it – as 'client-side scanning,' 'upload moderation,' or 'AI detection' – Chat Control is still mass surveillance," the Swiss company said. "And regardless of its technical implementation, mass surveillance is always an incredibly bad idea."