The UK government is set to postpone the introduction of controversial provisions in its Online Safety Bill that would compel tech companies to build back-door access into encrypted messaging services. Ministers had previously insisted such measures were necessary to help law enforcement agencies protect internet users, particularly children, from harmful content, but they have faced opposition from tech companies and campaigners who say the provisions could endanger privacy and pose a cybersecurity risk.
In a statement to the House of Lords later today, the government will reportedly confirm that communications regulator Ofcom will only require companies to introduce back-door access once a technology is developed that is capable of scanning encrypted messages for illegal content. The news was first reported by the FT, citing sources briefed on the changes.
A government spokesperson said its position has not changed, but that the power would only be used by Ofcom as “a last resort”.
The Online Safety Bill and end-to-end encryption
The Online Safety Bill has been drawn up primarily to stop children from accessing harmful content by imposing requirements on social media platforms and other tech companies over how they assess and remove illegal material.
The legislation has been in the works for a number of years in various guises and was paused last year in the face of opposition from Conservative MPs who felt it would hinder free speech by forcing platforms to suppress “legal but harmful” content. Subsequently, an amended version was agreed between ministers and has passed its first reading in the House of Commons, progressing to the Lords for further scrutiny.
One of the effects of the bill is that companies providing end-to-end encrypted messaging, such as WhatsApp or Signal, will be instructed to put in place methods that automatically scan for child sex abuse material (CSAM) so it can be dealt with by police.
It is thought that the only way to do this effectively is through client-side scanning, in which the contents of a message are checked on the user's device, before encryption, to ensure they contain nothing illegal. Apple attempted to introduce a similar system to scan photos uploaded to iCloud for CSAM, but withdrew it shortly afterwards following a privacy backlash.
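To illustrate the general idea (and not any vendor's actual implementation), the sketch below shows the basic shape of client-side scanning: the client compares a message against a list of fingerprints of known prohibited content before encrypting and sending it. Real systems use perceptual image hashing and more sophisticated matching protocols rather than the plain SHA-256 digest used here, and the hash list, key handling and function names below are hypothetical placeholders.

```python
# Illustrative sketch of client-side scanning -- not any real service's implementation.
# A plain SHA-256 digest stands in for the perceptual hashes real systems use, and the
# "known content" hash list is a hypothetical placeholder.
# Requires the third-party 'cryptography' package (pip install cryptography).

import hashlib
from cryptography.fernet import Fernet  # symmetric encryption as a stand-in for E2EE

# Hypothetical database of fingerprints of known prohibited content.
KNOWN_CONTENT_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def fingerprint(payload: bytes) -> str:
    """Compute a content fingerprint. Real deployments use perceptual hashing
    so that re-encoded or slightly altered media still match."""
    return hashlib.sha256(payload).hexdigest()


def send_message(payload: bytes, key: bytes) -> bytes | None:
    """Scan the plaintext on the client, then encrypt and 'send' it.
    The scan happens before encryption, which is why critics argue the
    approach bypasses the guarantees of end-to-end encryption."""
    if fingerprint(payload) in KNOWN_CONTENT_HASHES:
        # A real system would raise a report rather than silently dropping the message.
        return None
    return Fernet(key).encrypt(payload)


if __name__ == "__main__":
    key = Fernet.generate_key()
    ciphertext = send_message(b"hello", key)
    print("sent" if ciphertext else "blocked")
```

The key point the sketch makes concrete is that the check runs on the plaintext, on the device, before any encryption takes place, which is the crux of the privacy objections discussed below.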
The bill has been opposed by WhatsApp and other encrypted messaging services, which in April wrote an open letter to the government urging it to have a change of heart. It has been suggested WhatsApp could leave the UK altogether should the encryption provisions in the bill come into force.
Speaking to Tech Monitor in April, Prime Minister Rishi Sunak said he backed the legislation in its current form. “I think everyone wants to make sure their privacy is protected online, but people also want to know that law enforcement agencies are able to keep them safe and have reasonable ways to be able to do that, and that’s what we’re trying to do with the Online Safety Bill,” Sunak said.
Does the technology required in the Online Safety Bill exist?
Critics of the bill say the technology required to allow such access to end-to-end encrypted messaging platforms doesn’t even exist.
A government spokesperson told Tech Monitor its position “has not changed”, explaining: “Our stance on tackling child sexual abuse online remains firm, and we have always been clear that the Bill takes a measured, evidence-based approach to doing so.”
The spokesperson added: “As has always been the case, as a last resort, on a case-by-case basis and only when stringent privacy safeguards have been met, it will enable Ofcom to direct companies to either use, or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content – which we know can be developed.”
News of the changes to the Online Safety Bill is likely to be welcomed by the wider IT industry. A poll by BCS, the Chartered Institute of IT, last year found that only 14% of its members consider the legislation fit for purpose, with more than half believing it will not help create a safer internet.
Adam Leon Smith of BCS said: “We have consistently recommended to [the] government that the Online Safety Bill should not put its trust in emerging technology solutions to deliver child protection without rigorous analysis [of] their flaws, evaluation of the privacy trade-off, and a balancing emphasis on education and awareness.
“We welcome revisions being proposed to ensure that accredited technologies will meet minimum standards of accuracy, and that notices to encryption providers will only be issued when technically feasible and when meeting stringent privacy safeguards. We look forward to seeing more detail.”
But Matthew Hodgson, CEO and founder of UK-based encrypted messaging service Element, believes the government is merely “kicking the can down the road”, and that “saying ‘no scanning until it’s technically feasible’ is nonsense”. He explains: “Scanning is fundamentally incompatible with end-to-end encrypted messaging apps. Scanning bypasses the encryption in order to scan, exposing your messages to attackers.”
Hodgson adds: “All ‘until it’s technically feasible’ means is opening the door to scanning in future rather than scanning today. It’s not a change, it’s kicking the can down the road.”