The second reading of the government’s Online Safety Bill is expected to take place today. The bill, designed to help keep citizens safe when using online services, is aimed primarily at social media companies, but could see the 25,100 businesses that fall into its scope spend £250m to avoid legal action by regulator Ofcom.
According to the latest impact assessment on the bill, published on Thursday, micro, small, medium and large businesses that fall into the scope of “user-to-user services and search services” will have to bear a number of costs to adhere to the new regulations. In 2020, DCMS commissioned consultancy firm RR to estimate the number of organisations in scope of the framework and to determine the likely incremental costs of compliance.
In addition, the assessment says 180,000 “platforms” could fall into scope. It is likely they will need to spend between £9.6m and £17.5m to “familiarise” themselves with the new bill.
What is the Online Safety Bill?
The government says the Online Safety Bill is designed to protect children from harmful content such as pornography and limit people’s exposure to illegal content, while protecting freedom of speech.
It will “require social media platforms, search engines and other apps and websites allowing people to post their own content to protect children, tackle illegal activity and uphold their stated terms and conditions,” a press release from the Department for Digital, Culture, Media and Sport says.
Enforced by Ofcom, the bill could see companies fined for failing to comply with the law, with penalties of up to 10% of their annual global turnover. The government says these companies will have to “improve their practices and block non-compliant sites.”
The bill was first presented to Parliament in May 2021. A joint committee published a report in December 2021, outlining recommendations for the next draft. The latest draft, published this week, also has new offences that make company senior managers criminally liable for destroying evidence as well as failing to attend or provide false information in interviews with Ofcom. It also includes offences for obstructing the regulator when it attempts to enter company offices.
Digital secretary Nadine Dorries said that tech firms “haven’t been held to account” for the behaviour that happens on their platforms.
“Given all the risks online, it’s only sensible we ensure similar basic protections for the digital age,” she said in a press release.
Further changes to the original draft of the bill include bringing paid-for scam adverts on social media and search engines into scope to combat online fraud, requiring robust checks to ensure users of commercial sites are 18 or over, introducing measures to clamp down on anonymous trolls, and making companies proactively tackle the most harmful illegal content and criminal activity.
Dame Melanie Dawes, chief executive of Ofcom, says that today will mark an “important step towards creating a safer life online”.
“Our research shows the need for rules that protect users from serious harm, but which also value the great things about being online, including freedom of expression,” she says.
What powers does the Online Safety Bill give Ofcom?
According to the UK Government’s announcement, the latest draft of the bill would give Ofcom the power to demand information and data from tech companies. This includes the role of their algorithms in selecting and displaying content, so it can assess how they are shielding users from harm.
Ofcom will also be able to enter companies’ premises to access data and equipment, request interviews with company employees and require companies to undergo an external assessment of how they’re keeping users safe.
Additionally, Ofcom can take action against senior managers at companies that provide false or incomplete information, or that obstruct or delay Ofcom when it exercises its powers of entry, audit and inspection.
Falling foul of these offences could result in up to two years’ imprisonment or a fine, says the announcement.
Ofcom will also need to treat the information gathered from companies “sensitively”. For example, it will not be able to share or publish data without consent unless tightly defined exemptions apply. It will also have a responsibility to ensure its powers are used proportionately.
How will this bill affect UK businesses?
While there has been a big focus on social media platforms, the impact assessment for the Online Safety Bill shows that 25,100 companies will fall into its scope and will have to adhere to the regulations.
Emma Woollcott, partner and head of the reputation and crisis management team at law firm Mishcon de Reya, warns that the “tech giants” have been preparing for this bill for years, but that the challenge will hit hard for other, smaller businesses.
“The challenge will come for the some 25,100 (or likely more) businesses caught by the new law, which will need to carry out risk assessments to avoid significant fines and potential criminal penalties,” Woollcott warns.
"It appears a range of new criminal offences have been added to the bill, which could find senior managers at such companies criminally liable for destroying evidence, failing to attend or providing false information to Ofcom and/or obstructing the regulator when it enters the company's offices."
Currently, there is no official guidance or clarity on the criteria determining which companies will fall into the regulated category, beyond the statement that the bill will apply to "user-to-user services and search services".
If a company is deemed to be in category one, it will have to carry out risk assessments on the types of legal harms against adults that could arise on its services. It will also have to set out clearly in its terms of service how it will deal with such content, and enforce those terms consistently.
The DCMS secretary of state will have the power to add more categories of priority legal but harmful content via secondary legislation should they emerge in the future.
Woollcott adds that even more businesses may be required to take action as the way the bill is applied becomes clearer. "While the government's impact assessment suggests around 25,000 platforms will be in scope of the bill's provisions, that feels like a conservative estimate," she says.
Is the metaverse included in the assessment?
According to Nigel Cannings, co-founder and CTO of compliance technology vendor Intelligent Voice, these figures could be larger still, as the assessment doesn’t consider the growth of voice and video.
“The impact assessment seems to ignore the explosion of services that will use and store voice and video, like [gaming messaging app] Discord and new AR and VR services in the metaverse,” he warns.
“These types of platforms have always been shown to allow inappropriate and sexualised behaviour, and while they clearly need to be regulated, the cost of providing moderation on voice and video content is significantly higher than for text-based services.” Throughout the entire assessment, the metaverse is only mentioned once.
Email communication, voice-only calls and short messaging service (SMS)/multimedia messaging service (MMS) will remain outside the scope of the legislation, according to the assessment, as will business-to-customer interactions, which are not considered user-generated content.
“An example of this would be a complaints box where users can interact with a business as well as patient-doctor virtual services where users can have a virtual appointment with a physician,” it says.
What is the cost to businesses?
The cost to businesses transitioning to the new regulations is estimated at between £35.5m and £67.4m, the impact assessment says. The estimated total cost to businesses is £250.6m.
A further 180,000 platforms are expected to incur costs associated with “familiarisation”. These costs will accumulate in the first year of the appraisal period, says the assessment. Full compliance will be expected from 2025.
The initial familiarisation, in which businesses check whether they are in or out of scope, is likely to cost them between £3.2m and £8m. “Secondary familiarisation”, in which in-scope companies delve more deeply into the regulations, and the dissemination of that information will cost businesses £5.7m and £1.4m respectively, according to the impact assessment's central scenario.
There will also be additional costs to platforms incurred as a result of familiarising themselves with secondary legislation.
“At this stage, it is not possible to predict with any certainty how much material they will have to familiarise themselves with in order to comply,” says the assessment.
Will the Online Safety Bill even keep citizens safe?
Professor Andy Phippen, a fellow of BCS, the Chartered Institute for IT, and a specialist in ethics and digital rights at Bournemouth University, says: “The rhetoric around the bill seems to be around making children safe, but the entire drive of the proposed legislation is tech sector regulation.
“When I talk to young people about online safety, they usually say they need supportive adults who understand the issues and better education,” he continues. “None have ever demanded that big-tech billionaires need to be brought to heel.
“I recently spoke to a group of young people who were clear that good online safety comes from good education and the opportunity to discuss and ask questions.”