Tech firms cite risk to end-to-end encryption as Online Safety Bill gets royal assent


The government’s controversial Online Safety Bill has become law amid continued concerns from tech companies that it could damage the privacy of encrypted communications.

The Online Safety Act, which aims to make the internet safer for children, received royal assent in Parliament on 26 October 2023.

The act places legal duties on technology companies to prevent and rapidly remove illegal content, such as terrorist material and revenge pornography.

It also requires technology companies to protect children from seeing legal but harmful material, including content promoting self-harm, bullying, pornography and eating disorders.

The communications regulator, Ofcom, will have new powers to fine technology companies that fail to comply with the act up to £18m or 10% of their global annual turnover, whichever is greater – meaning the biggest tech companies could face fines running into billions.

The government has estimated that 100,000 online services will come under the Online Safety Act, with the most stringent obligations reserved for “Category 1” services that have the highest reach and pose the highest risk.

Technology secretary Michelle Donelan said the Online Safety Act would ensure online safety for decades to come. “The bill protects free speech, empowers adults and will ensure that platforms remove illegal content,” she said.

End-to-end encryption

But the Online Safety Act, which has taken four years to reach the statute books, continues to raise concerns for technology companies over provisions that could undermine encrypted communications.

Encrypted messaging and email services, including WhatsApp, Signal and Element, have threatened to pull out of the UK if Ofcom requires them to install “accredited technology” to monitor encrypted communications for illegal content.

Section 122 of the act gives Ofcom powers to require technology companies to install systems that, critics argue, would undermine the security and privacy of encrypted services by scanning the content of every message and email to check whether it contains child sexual abuse material (CSAM).

‘Catastrophic impact’ on privacy

Matthew Hodgson, CEO of Element, a secure communications company that supplies services to the Ministry of Defence, the US Navy, Ukraine and Nato, said its customers were demanding guarantees that the company would not implement message scanning if required to do so under the Online Safety Act.

“Some of our larger customers are contractually requiring us to commit to not putting any scanning technology into our apps because it would undermine their privacy, and we are talking about big reputable technology companies here. We are also seeing international companies doubting whether they can trust us as a UK-based tech supplier anymore,” he said.

Speaking on BBC Radio 4, Hodgson said the intentions of the bill were obviously good and that social media companies such as Instagram and Pinterest should be filtering posts for child abuse material.

However, giving Ofcom the power to require blanket surveillance in private messaging apps would “catastrophically reduce safety and privacy for everyone”, he said.

Hodgson said enforcement of Section 122 of the Online Safety Act against technology companies would introduce new vulnerabilities and weaknesses to encrypted communications systems that would be exploited by attackers.

“It is like asking every restaurant owner in the country to bug their restaurant tables – in case criminals eat at the restaurants – and then holding the restaurant owners responsible and liable for monitoring those bugs,” he said.

The CEO of encrypted mail service Proton, Andy Yen, said that without safeguards to protect end-to-end encryption, the Online Safety Act poses a real threat to privacy.

“The bill gives the government the power to access, collect and read anyone’s private conversations any time they want. No one would tolerate this in the physical world, so why do we in the digital world?” he said.

Writing in a blog post published on 27 October 2023, Yen said that while he was reasonably confident Ofcom would not use its powers to require Proton to monitor the contents of its customers’ emails, he was concerned that the act had been passed with a clause giving the British government powers to access, collect and read anyone’s private communications.

“The Online Safety Act empowers Ofcom to order encrypted services to use ‘accredited technology’ to look for and take down illegal content. Unfortunately, no such technology currently exists that also protects people’s privacy through encryption. Companies would therefore have to break their own encryption, destroying the security of their own services,” he wrote.

“The criminals would seek out alternative methods to share illegal materials, while the vast majority of law-abiding citizens would suffer the consequences of an internet without privacy and personal data vulnerable to hackers,” he added.

Meredith Whittaker, president of encrypted messaging service Signal, reposted the organisation’s position on X, formerly known as Twitter, that it would withdraw from the UK if it was forced to compromise its encryption.

“Signal will never undermine our privacy promises and the encryption they rely on. Our position remains firm: we will continue to do whatever we can to ensure people in the UK can use Signal. But if the choice came down to being forced to build a backdoor, or leaving, we’d leave,” she wrote.

Zero-tolerance approach

The Online Safety Act takes what the government describes as a “zero-tolerance approach” to protecting children.

It includes measures requiring tech companies to introduce age-checking on platforms where content harmful to children is published, and to publish risk assessments of the dangers their sites pose to children.

Tech companies will also be required to provide children and parents with clear ways to report problems, and to offer users options to filter out content they do not want to see.

Ofcom plans phased introduction

The communications regulator plans to introduce the legislation in phases, starting with a consultation process on tackling illegal content from 9 November 2023.

Phase two will address child safety, pornography, and the protection of women and girls, with Ofcom due to publish draft guidance on age verification in December 2023. Draft guidelines on protecting children will follow in spring 2024, with draft guidelines on protecting women and girls following in spring 2025.

Phase three will focus on categorised online services that will be required to meet additional requirements, including producing transparency reports, providing tools for users to control the content they see and preventing fraudulent advertising. Ofcom aims to produce draft guidance in early 2024.

Ofcom’s chief executive, Melanie Dawes, said it would not act as a censor, but would tackle the root causes of online harm. “We will set new standards online, making sure sites and apps are safer by design,” she added.

Advice to tech companies

Lawyer Hayley Brady, partner at UK law firm Herbert Smith Freehills, said technology companies should engage with Ofcom to shape the codes of practice and guidance.

“Companies will have the choice to follow Ofcom’s Codes of Practice or decide upon their own ways of dealing with content. Unless a company has rigorous controls in place, the safe option will be to adhere to Ofcom’s advice,” she said.

Ria Moody, managing associate at law firm Linklaters, said the Online Safety Act tackles the same underlying issues as the European Union’s Digital Services Act (DSA), but in a very different way.

“Many online services are now thinking about how to adapt their DSA compliance processes to meet the requirements of the OSA,” she said.

John Brunning, a partner at law firm Fieldfisher, said the broad scope of the act meant many more businesses would be caught by its provisions than people expected.

“Expect plenty of questions when it comes to trying to implement solutions in practice,” he said.

These include how to judge whether a service is likely to be accessed by children, whether companies will need to geo-block UK users from sites that are not targeted at the UK, and where technology companies should draw the line on harmful content.

Frankie Everitt, director at Fieldfisher, said online platforms and businesses would not need to take steps to comply immediately. “This is just the beginning of a long process. Government and regulators will need to fill in the detail of what is just a roughly sketched outline of legislation,” she said.


