More than 300 engineers from 33 countries have signed an open letter asking the European Union to think carefully about how it intends to police the digital exchange of child sexual abuse material: “The effectiveness of the law is based on the existence of effective scanning technologies. Unfortunately, the scanning technologies that exist today and that are on the horizon are profoundly flawed,” says the text, whose signatories include two cryptography experts who are Turing Award laureates (the award considered the Nobel Prize of computer science), Ron Rivest and Martin Hellman. The technical obstacles to what politicians intend had until now been left out of the discussion. This letter aims to remedy that.
The engineers acknowledge the laudable goal of limiting child exploitation, but believe the European initiative would only create greater problems for the rest of the citizenry and turn our phones into potential always-on spying devices: “It is very important that the technical side enters the debate, so that the public finds out what is being prepared in Brussels, almost on tiptoe,” says Carmela Troncoso, a Spanish researcher at the Swiss Federal Institute of Technology in Lausanne (EPFL) and one of the promoters of the letter. “It would be like putting a camera in our classrooms to record everything and saying: ‘it will only be sent if it is suspicious.’”
As encrypted messaging works today, the only place where it is viable to analyze message content is on each user’s phone. While messages are in transit they are inaccessible without breaking the encryption, one of the pillars of the privacy of online communications: “Encryption is the only tool we have to protect our data in the digital realm; all other tools have been shown to be compromised,” the letter reads.
Scanning the devices is unfeasible today without adding what, according to the letter’s signatories, would be “espionage software” to the phones of all European citizens. “These tools would apparently work by scanning the content on the user’s device before it has been encrypted or after it has been decrypted, and then report back whenever illicit material is found,” the letter says. “You can equate this to adding video cameras to our homes to listen to every conversation and send reports when we talk about illegal topics.”
“The main intention of the letter is to make it clear that the technology is not capable of this, since many seem to believe that it is,” explains Carmela Troncoso. “We think it’s important that both regulators and the public have all the information about the technology’s limits,” she adds.
On the political side, everything seems much simpler. It would be the service providers (the apps) who would be in charge of finding the criminal material: “The proposed rules will force providers to detect, report and eliminate child sexual abuse material in their services,” says the proposal’s rapporteur, Javier Zarzalejos, a Spanish MEP from the People’s Party. “Providers will need to assess and mitigate the risk of misuse of their services and the measures taken should be commensurate with that risk and subject to robust conditions and safeguards.” From a technical point of view, these demands are unworkable, the letter says. Apple already abandoned a similar initiative on its devices in 2022, deeming it unfeasible.
“As scientists, we do not expect it to be feasible in the next 10 to 20 years to develop a solution that can run on users’ devices without leaking illegal information and can reliably detect known content, i.e. with an acceptable number of false positives and negatives,” the letter says. That reference to false positives is key.
False positives with erotic images
This technology works by assigning a very long number (a hash) to every known image of child abuse. When a match is found on a device, an alert would go off for the authorities. But the approach is technically full of holes: it is trivial to alter a criminal image slightly so that its hash changes, and it is also possible to create legal images whose hashes collide with those of criminal ones, flooding the authorities with work and wasting their time.
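The brittleness of hash matching is easy to demonstrate. Real scanning systems use perceptual hashes rather than the cryptographic hash below, but the following minimal sketch (with made-up image bytes) shows the underlying principle the letter criticizes: a change of a single bit in a file is enough for an exact hash to stop matching any database entry.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Exact (cryptographic) hash of the raw bytes. Deployed scanners use
    # perceptual hashes that tolerate small changes, but those can also be
    # evaded or collided, as the letter argues.
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-in for the raw bytes of a known image.
original = b"...raw bytes of a known image..."
# Flip one bit in the last byte: visually negligible for a real image.
tweaked = original[:-1] + bytes([original[-1] ^ 0x01])

print(image_hash(original))
print(image_hash(tweaked))
print(image_hash(original) == image_hash(tweaked))  # prints False
```

With an exact hash, the two digests share nothing recognizable; a scanner comparing against a fixed database would simply miss the tweaked file.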
This problem would therefore cause millions of perfectly legal images to end up before the eyes of the agents charged with reviewing these false positives: “At the scale at which private messages are exchanged, even scanning only the messages exchanged in the EU in a single app would mean generating millions of errors every day,” the letter clarifies. “That means that by scanning billions of images, videos, text and audio messages per day, the number of false positives will be in the hundreds of millions. Furthermore, it seems likely that many of these false positives are deeply private images, likely intimate and entirely legal, sent between consenting adults.” In plain terms: the authorities may end up seeing millions of private erotic images that European citizens exchanged for their own pleasure.
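The scale argument is a base-rate calculation. The figures below are illustrative assumptions, not numbers from the letter, but they show why even a seemingly small error rate produces an unmanageable flood of false alerts:

```python
# Back-of-the-envelope estimate; both numbers are assumptions chosen
# for illustration, not figures taken from the open letter.
messages_per_day = 10_000_000_000   # ~10 billion messages scanned daily
false_positive_rate = 0.001         # 0.1%: optimistic for this kind of classifier

false_positives_per_day = int(messages_per_day * false_positive_rate)
print(f"{false_positives_per_day:,} false positives per day")  # 10,000,000
```

Even at a 0.1% error rate, ten million legal messages per day would be flagged for human review; at the multi-billion-message volumes the letter cites, the count reaches the hundreds of millions it warns about.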
In the letter, the specialists also foresee two other serious problems: pedophiles would find other, more sophisticated ways to exchange their material, even further from the eyes of the authorities; and this new law would end up further eroding the already scant privacy our phones afford. With this software installed on phones, it is hard not to think that the authorities would seek to get more out of it: “We expect there will be substantial pressure on politicians to expand its reach. First to detect terrorist recruitment, then other criminal activity, and then dissident speech,” the letter says. Less democratic governments would only have to expand the database to hunt other kinds of content that have nothing to do with child pornography.