May 13, 2021 (updated 15 Aug 2022 11:49am)

The UK’s Online Safety Bill could drown internet services in red tape

Proposed legislation targets illegal and 'legal-but-harmful' content, and could leave businesses facing a huge regulatory burden.

By Laurie Clarke

The UK government has unveiled sprawling new legislation that takes aim at online speech on internet services – stretching from illegal to legal yet “harmful” content. The wide-ranging nature of the proposals could leave internet businesses large and small facing a huge bureaucratic burden, and render the bill impractical to implement. 

The Online Safety Bill is a “litmus test” for prime minister Boris Johnson’s commitment to free speech, says the Open Rights Group. (Photo by Dan Kitwood/Getty Images)

The draft Online Safety Bill puts the onus on internet service providers to moderate user-generated content in a way that’s intended to prevent users from being exposed to illegal and harmful content online. It retains the underlying principle of ‘duty of care’ that was first introduced in the Online Harms White Paper in 2019.

Internet services could face fines of up to £18m or 10% of their annual global revenues – whichever is higher – if they flout the new duty of care regime. Communications regulator Ofcom will be given new powers to implement and enforce the regime. The bill is expected to be written into law by the end of the year, although it may be amended before then. 
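To get a sense of how that "whichever is higher" cap scales with company size, here is a minimal sketch of the arithmetic. The figures are hypothetical and purely illustrative; nothing below is drawn from the bill's drafting.

```python
# Rough, hypothetical illustration of the penalty cap described above:
# fines are capped at £18m or 10% of annual global revenue, whichever is higher.

def max_penalty(annual_global_revenue_gbp: float) -> float:
    """Return the maximum fine (in GBP) under the proposed regime."""
    return max(18_000_000.0, 0.10 * annual_global_revenue_gbp)

# A firm with £50m in revenue is still capped at the £18m floor,
# while a platform with £50bn in revenue faces a cap of £5bn.
print(max_penalty(50_000_000))       # 18000000.0
print(max_penalty(50_000_000_000))   # 5000000000.0
```

In practice this means the £18m figure acts as a floor for smaller services, while the 10% test is what bites for the largest platforms.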

UK Online Safety Bill: vague definitions could cause problems

In essence, the duty of care regime requires service providers, which include “user-to-user services” and search engines, to conduct impact assessments, follow codes of practice drafted by Ofcom, and limit the prevalence of serious illegal activity such as terrorist content and child sexual exploitation or abuse content. Services will face different levels of liability, depending on factors such as their risk assessments, the codes set out by Ofcom, and the prevalence of serious illegal and harmful content on their platforms.

“Ofcom will have to consult a long list of entities to prepare codes of practice, including people affected by harm and human rights experts, which is helpful,” says Edina Harbinja, senior lecturer in media/privacy law at Aston Business School. “However, the secretary of state can directly make modifications to the codes.” Some “category 1” services, which are yet to be fully defined but will probably include the largest social media platforms, will also be tasked with removing “harmful” content that is not necessarily illegal. This means that what’s legal to say out loud could soon have to be taken down when typed online in the UK.


The inclusion of legal but “harmful” content in the bill is something digital rights experts such as the Open Rights Group have challenged since the online harms legislation was first proposed, and the category’s preservation in the final bill has disappointed many. Critics have also decried the vagueness of the term “harm”, which applies when the service provider “has reasonable grounds to believe” that content poses a material risk of directly or indirectly causing someone a significant adverse physical or psychological impact. It’s particularly unclear what “indirect harm” could mean, Harbinja points out.


Hefty costs incoming for businesses

The bill differs from the EU’s draft online content legislation, the Digital Services Act, in that it doesn’t apply only to the most popular social media platforms. Despite Tuesday’s Queen’s Speech emphasising the importance of economic growth to the UK, the Online Safety Bill would require many technology companies to make massive, costly changes to how they operate, says Jon Baines, senior data protection specialist at law firm Mishcon de Reya. “What many commentators seem to have missed is the serious compliance challenges and costs for affected businesses.”

Ofcom will be able to levy fees on internet services to fund the vast new regulatory regime. “Given the breadth and range of functions Ofcom would have to undertake, one imagines those fees will be considerable, but they – surely – cannot be allowed to be so high that they discourage business and innovation in the technology sector,” says Baines.

Different rules for different content types

There are exemptions from the legislation for email, MMS and SMS messages, but only if they represent “the only user-generated content enabled by the service”, meaning that Facebook Messenger would still come under the scope of the regulation. Similarly, ‘one-to-one live aural communications’ are exempt, but only if the communications consist solely of speech and don’t include any written message or video, so Zoom will be affected by the legislation too. 

Internal business services, paid-for advertisements, and comments and reviews on provider content are exempt. News publisher content is exempt, but only if it’s a “recognised news publisher”, i.e. on an approved government list. 
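One rough way to see how those scoping tests interact is to write them out as simple checks, as in the sketch below. The attribute names are invented purely for illustration; the bill defines these exemptions in statutory language, not in anything resembling code.

```python
# Hypothetical sketch of the exemption tests described above.
# Attribute names are illustrative only, not taken from the bill's text.

def in_scope(service: dict) -> bool:
    """Return True if a service would fall under the duty of care regime."""
    # The email/SMS/MMS exemption applies only if messaging is the *only*
    # user-generated content the service enables.
    if service.get("only_user_content_is_email_sms_mms"):
        return False
    # One-to-one live aural communications are exempt only when they are
    # speech alone, with no written messages or video attached.
    if service.get("one_to_one_live_aural") and not service.get("has_text_or_video"):
        return False
    return True

# Facebook Messenger enables far more than email/SMS, and Zoom adds video
# and chat to its calls, so on this reading both would stay within scope.
print(in_scope({"one_to_one_live_aural": True, "has_text_or_video": True}))  # True
```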

That private messaging services such as Facebook Messenger are affected by the legislation has alarmed digital rights activists. “The idea that private messages should be routinely checked and examined is extraordinary,” says Jim Killock, director of the Open Rights Group. “This is a litmus test for the Conservative Party and the prime minister over how committed they are to the principles of free speech.”

In addition to the removal of content, category 1 services will have duties related to rights to freedom of expression and privacy, duties to protect content of “democratic importance”, and duties to protect journalistic content. This means political speech or speech from politicians must be preserved by platforms. “[This] may create issues for content moderation on platforms, plus a question of whether political speech should be distinguished in this way from other important forms of free speech,” says Harbinja. 

Journalistic content should also be protected, the bill stipulates, but it is again vague about how this would work. “It seems to cover user content ‘generated for the purpose of journalism’, as long as there is a UK link,” says Harbinja. The government’s press release about the bill notes that citizen journalism should have the same protections as professional journalism, “but in practice, this will be difficult to implement and ascertain if a given user post should be deemed as journalistic in nature and a take-down challenged as such.”

The bill does not refer to human or digital rights, but vaguely mandates a duty regarding ‘rights to freedom of expression and privacy’ in a section that seems to have been appended at the last minute, says Harbinja. “‘A duty to have regard to the importance of’ free speech and privacy, in my interpretation, almost reads like ‘please think of free speech and privacy sometimes’,” she says. 

Will the UK Online Safety Bill succeed?

Many features of the bill rely on secondary legislation and guidance from Ofcom that hasn’t been published yet. “It may therefore be some time before the true regulatory burden on companies is clear… In the meantime, the risk of fines – plus Ofcom’s business disruption powers – may lead to regulated companies adopting a cautious approach in applying the new rules,” says Maeve Hanna, partner at law firm Allen & Overy.

The legislation is likely to provoke fierce debate over the coming months. “Red tape and the bureaucratic burden on service providers and Ofcom is going to be massive,” says Harbinja. “It is, therefore, doubtful whether this can ever be implemented in practice, even if we agreed that the substance itself is acceptable.” 
