UK could force messaging apps to look for child sex abuse images

Strongly encrypted messaging services such as WhatsApp could be forced to use cutting-edge technology to detect child sexual abuse material or face significant fines, under new changes to the UK's online safety legislation.

The amendment to the Online Safety Bill would require technology companies to use their best efforts to deploy new technologies that identify and remove child sexual abuse and exploitation (CSAE) content.

It comes as Mark Zuckerberg’s Facebook Messenger and Instagram apps prepare to roll out end-to-end encryption, amid strong opposition from the UK government, which has called the plans “unacceptable”.

Priti Patel, a longtime critic of Zuckerberg’s plans, said the law change balances the need to protect children with the privacy of online users.

The Home Secretary said: “Sexual abuse of children is a heinous crime. We all need to work to ensure that criminals are not allowed to run rampant online, and tech companies need to play their part and take responsibility for keeping our children safe.

“Privacy and security are not mutually exclusive – we need both and we can have both, and this amendment delivers that.”

Child safety activists have warned that strong encryption would prevent law enforcement agencies and technology platforms from seeing illegal messages, because it ensures only the sender and recipient can see their content – a process known as end-to-end encryption. However, officials said the change is not an attempt to stop the rollout of more such services, and that any technology deployed must be effective and proportionate.
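To illustrate the property at stake, the sketch below uses the open-source PyNaCl library to show end-to-end encryption in miniature: the message is encrypted with keys held only by the two parties, so a platform relaying the ciphertext cannot read it. This is a simplified illustration, not the Signal protocol that WhatsApp actually runs, and the names used are invented for the example.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# Illustrative only: real messengers such as WhatsApp use the more elaborate
# Signal protocol, which adds key ratcheting and forward secrecy.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"hello, only Bob can read this")

# A server carrying `ciphertext` sees only random-looking bytes.
# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello, only Bob can read this"
```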

Zuckerberg’s Meta business, which also owns the encrypted messaging service WhatsApp, has delayed the launch of its Messenger and Instagram plans until 2023.

Screening private messages for child abuse material has proved controversial, with activists warning of negative consequences for user privacy. One contested method that could be considered by the communications regulator overseeing the implementation of the bill is client-side scanning. Apple has postponed plans to roll out such technology, which would scan users’ pictures for child sexual abuse material on the device itself; under the company’s proposal, photos would be compared against known child abuse images at the point where users chose to upload them to the cloud.
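As a rough illustration of how this kind of matching works, the sketch below checks an image’s fingerprint against a hypothetical list of fingerprints of known abuse imagery before upload. It uses a plain cryptographic hash to keep the example short; deployed systems, such as Microsoft’s PhotoDNA or Apple’s proposed NeuralHash, instead use perceptual hashes designed to survive resizing and re-encoding.

```python
# Minimal sketch of hash-based image matching against a known-bad list.
# KNOWN_BAD_HASHES is a hypothetical placeholder; real deployments use
# perceptual hashes (e.g. PhotoDNA, NeuralHash) rather than exact SHA-256
# matches, so that resized or re-encoded copies of an image still match.
import hashlib
from pathlib import Path

KNOWN_BAD_HASHES: set[str] = {
    # hex digests of known images would be distributed to the client
}

def flag_before_upload(image_path: str) -> bool:
    """Return True if the image matches a known-bad fingerprint."""
    digest = hashlib.sha256(Path(image_path).read_bytes()).hexdigest()
    return digest in KNOWN_BAD_HASHES
```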

Under the proposed change, the regulator, Ofcom, can require tech companies to deploy or develop new technology that can help find and stop abusive material from spreading. The amendment strengthens an existing clause in the bill, which already gives Ofcom the power to require the use of “accredited technology”. The change now requires companies to use their best efforts to deploy or develop “new” technology when the existing technology is not appropriate for their platform.

If a company fails to use this technology, Ofcom has the power to impose fines of up to £18m or 10% of a company’s global annual turnover, whichever is greater. The Online Safety Bill returns to Parliament next week after consideration by a committee of MPs and is expected to come into force in late 2022 or early 2023.
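For concreteness, the “whichever is greater” rule works as in the short sketch below; the turnover figure is an invented example, not a real company’s accounts.

```python
def maximum_fine(global_annual_turnover_gbp: float) -> float:
    """Greater of a flat £18m or 10% of global annual turnover."""
    return max(18_000_000.0, 0.10 * global_annual_turnover_gbp)

# With an invented turnover of £100bn, 10% (£10bn) exceeds the £18m floor.
assert maximum_fine(100_000_000_000.0) == 10_000_000_000.0
```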

According to the National Crime Agency, between 550,000 and 850,000 people in the UK pose a sexual risk to children. “We need technology companies to be on the front line with us, and these new measures will ensure that,” said Rob Jones, the NCA’s director general for child sexual abuse.
