Cressida Dick: Tech giants make it impossible to stop terrorists

The Metropolitan Police Commissioner has accused tech giants of making it harder to track down and stop terrorists. Dame Cressida Dick wrote in the Telegraph on Saturday that the tech giants’ focus on end-to-end encryption has made it “in some cases impossible” for police to do their job.

On Wednesday, Home Secretary Priti Patel launched a new fund for technologies to keep children safe, and urged tech companies to put user safety before profit. But cybersecurity experts said they were not sure it was possible to build the solutions the government wants.

Writing to mark the 20th anniversary of the September 11 attacks, Dame Cressida emphasized that advances in communication technologies mean terrorists can now reach “anyone, anywhere, anytime” via social media and the internet. The UK, she argued, must constantly develop its own digital capabilities to keep up with terrorists who are using technology to their advantage.

Dame Cressida’s message matches that of Ms Patel, who launched the Safety Tech Challenge Fund earlier this week at the G7 Home Affairs Ministers’ meeting. The fund is open to experts from around the world and aims to combat child sexual abuse online. Up to £85,000 will be awarded to each of five applicants to develop new technologies that can detect Child Sexual Abuse Material (CSAM) online without breaking end-to-end encryption.

End-to-end encryption is a privacy feature that makes it impossible for anyone other than the sender and recipient to read messages sent online. While tech giants like Facebook have said using this kind of technology will protect users’ privacy, many governments, including the US, UK and Australia, have repeatedly challenged the idea since 2019.
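The core property described above — that only the two endpoints holding the key can read a message, while any server relaying it sees only ciphertext — can be illustrated with a deliberately simplified sketch. This is not how WhatsApp or iMessage actually work (they use the far more sophisticated Signal-style protocols); the one-time-pad XOR below is purely a toy illustration of the principle.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with a key byte (one-time pad, toy example)."""
    return bytes(m ^ k for m, k in zip(message, key))

# XOR is its own inverse, so decryption is the same operation.
decrypt = encrypt

message = b"hello"
# The key is shared only by the sender and the recipient.
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)   # this is all a relaying server ever sees
assert ciphertext != message
assert decrypt(ciphertext, key) == message
```

Without the key, the ciphertext reveals nothing about the message, which is exactly why governments argue such systems hinder investigations.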

Apple plan controversy

Cybersecurity and privacy experts believe Ms Patel’s and Dame Cressida’s comments may be a response to Apple’s decision earlier this month to delay its plan to scan iPhones for CSAM.

First announced in August, the detection technology compares images against unique “digital fingerprints”, or hashes, of known CSAM held in a database by the National Center for Missing and Exploited Children, before the images are uploaded to iCloud.
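The matching step described above can be sketched in a few lines. Note the hedge: Apple’s system uses a perceptual hash (NeuralHash), designed so that visually similar images produce the same fingerprint; the SHA-256 used here is only a stand-in to show the lookup-against-a-database mechanic, and the database contents are hypothetical.

```python
import hashlib

# Hypothetical database of "digital fingerprints" (hashes) of known images.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute the 'digital fingerprint' of an image (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the known database."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_material(b"known-image-bytes"))    # True
print(matches_known_material(b"holiday-photo-bytes"))  # False
```

Because only fingerprints are compared, the system never needs to “see” the user’s photos directly; the controversy, covered below, is that the comparison still runs on the user’s own device.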

Apple’s technology has been widely criticized by privacy groups and the cybersecurity industry for using a person’s own device to check whether they are a potential criminal, setting a dangerous precedent.

“We already have end-to-end encryption on Apple’s iMessage messaging technology. It is strange that law enforcement and the government have not approached Apple about this. Instead, it’s all about attacking Facebook and WhatsApp,” said security researcher Alec Muffett.

Much has been written about the wealth of data tech giants hold on the users of their services, particularly the fact that they constantly monitor user behavior and interests in order to serve personalized advertisements. Muffett argues that technology companies already have the technology they need to track down pedophiles and terrorists simply by monitoring that behavior, and so do not have to compromise users’ privacy by viewing all the personal files on their phones.

Muffett, who has more than 30 years of experience in cybersecurity and cryptography, said: “If you have the Facebook account of a middle-aged man randomly texting a dozen young people, you may have suspicious activity. It may be harmless, but it is certainly a topic worth investigating.”

“The UK government is trying to track down CSAM by looking at the content, as if it were trying to spy, rather than by observing behavior.”

In addition, he says, multiple cybersecurity researchers have tested Apple’s NeuralHash algorithm and found that it identified two completely different images as the same photo, raising fears that Apple could falsely accuse users of possessing criminal content.

Criticism of the new technology fund

A leading cybersecurity expert, who declined to be named, said what the government wanted was not technically possible.

“You can change the law of the land, but you cannot change the law of science.

“There is no way to allow mass scanning of devices without undermining end-to-end encryption protection. If someone manages to maintain valid end-to-end encryption while detecting child sexual abuse images, they stand to make well over £85,000, so I don’t understand the economics of it.”

Another cybersecurity chief agrees: “Whenever the government, Facebook or other sources make a statement about doing more, it gives them (pedophiles and terrorists) greater reach. If you read between the lines, Ms Patel is basically saying they want to hire hackers.”

There are also concerns about privacy. Dr Rachel O’Connell, an online child safety expert and founder of TrustElevate, asked: “Can we trust that those in power will not abuse these powers?”

According to data protection expert Pat Walshe, Apple’s solution is illegal. He said he had asked the tech giant to explain how it could be deployed in Europe and had not yet received a response. “The European Court of Justice (ECJ) has said that the mobile phone is an extension of our private sphere. The court holds that the device and all information on it is part of the private sphere of our lives, meaning the mobile phone must be protected as private. In other words, it requires protection under the European Convention on Human Rights (ECHR).”

Mr Walshe, who led a team working with government and law enforcement agencies at mobile operator Three, is also deeply concerned about the technology funding proposal, as it raises too many questions about privacy. Instead, he says, there should be better, more direct reporting channels to enable both citizens and communications providers to report CSAM to tech companies or law enforcement. “And law enforcement needs a huge boost in training, manpower and funding to handle those reports,” he said. “I would put more emphasis on that, rather than tearing down the technology that keeps us safe every day.”
