European police chiefs have said the complementary partnership between law enforcement and the tech industry is at risk due to end-to-end encryption (E2EE).
They called on industry and governments to take urgent action to ensure public safety across social media platforms.
“Privacy measures currently being implemented, such as end-to-end encryption, will prevent tech companies from seeing any crime occurring on their platforms,” Europol said.
“Furthermore, it will prevent law enforcement from obtaining and using this evidence in investigations to prevent and prosecute the most serious crimes such as child sexual abuse, human trafficking, drug trafficking, murders, economic crime and terrorist crimes.”
The idea that E2EE protections could hinder law enforcement is often referred to as the “going dark” problem, reflecting concerns that encryption creates new obstacles to gathering evidence of nefarious activity.
The development comes in the context of Meta’s implementation of E2EE in Messenger by default for personal calls and one-to-one personal messages starting in December 2023.
The UK’s National Crime Agency (NCA) has since criticized the company’s design choices, arguing that they make it harder to protect children from online sexual abuse and undermine the agency’s ability to investigate crime and protect the public from serious threats.
“Encryption can be hugely beneficial, protecting users from a range of crimes,” said NCA director general Graeme Biggar. “But major technology companies’ abrupt and increasingly widespread rollout of end-to-end encryption, without sufficient regard for public safety, is putting users in danger.”
Europol Executive Director Catherine de Bolle noted that technology companies have a social responsibility to develop a safe environment without hindering law enforcement’s ability to gather evidence.
The joint statement also urges the tech industry to build security into its products while also providing a mechanism to identify and report harmful and illegal content.
“We do not accept that there needs to be a binary choice between cybersecurity or privacy on the one hand and public safety on the other,” the agencies said.
“Our view is that the technical solutions exist; they simply require flexibility from industry and governments. We recognize that the solutions will be different for each capability and will also differ between platforms.”
Meta, for what it’s worth, already relies on a variety of signals gleaned from unencrypted information and user reports to combat child sexual exploitation on WhatsApp.
Earlier this month, the social media giant also said it is testing a new set of features on Instagram to protect young people from sextortion and intimate image abuse using client-side scanning.
“Nudity Protection uses on-device machine learning to analyze whether an image sent in a direct message on Instagram contains nudity,” Meta said.
“Because the images are analyzed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta will not have access to these images unless someone chooses to report them to us.”
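The key property of client-side scanning is that the classification happens on the recipient's device after decryption, so the server never sees the plaintext image. The sketch below illustrates that flow conceptually; the classifier is a hypothetical stand-in (Meta's actual on-device model and APIs are not public), and the marker-byte check exists only so the example is runnable.

```python
# Conceptual sketch of client-side (on-device) scanning in an E2EE chat.
# The classifier below is a hypothetical stub, NOT Meta's model: it keys
# off a marker byte so the example runs without any ML dependency.

from dataclasses import dataclass


@dataclass
class Message:
    image_bytes: bytes      # plaintext image, available locally post-decryption
    flagged: bool = False   # set on-device; never transmitted to the server
    blurred: bool = False   # UI state: image shown blurred until user opts in


def on_device_classifier(image_bytes: bytes) -> bool:
    """Stand-in for an on-device ML nudity classifier (hypothetical)."""
    return image_bytes.startswith(b"NSFW")


def receive_message(image_bytes: bytes) -> Message:
    """Runs entirely on the recipient's device.

    Because E2EE decryption happens locally, the plaintext image can be
    analyzed here without the service provider ever accessing it.
    """
    msg = Message(image_bytes=image_bytes)
    if on_device_classifier(image_bytes):
        msg.flagged = True
        msg.blurred = True  # displayed blurred until the user chooses to view
    return msg
```

The point of the design is that `flagged` and `blurred` are purely local state: nothing leaves the device unless the user explicitly reports the content, which is consistent with Meta's statement that it cannot access images in E2EE chats absent a user report.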