August 26, 2021 (updated 31 Aug 2021, 6:27pm)

Start-up body Coadec ‘baffled’ by UK’s new Children’s Code

The UK's Children's Code is due to be enforced from next week. Critics say its scope is unclear and that it fails to address the underlying causes of harm to children online.

By Laurie Clarke

The UK will soon make legally binding a set of regulations aimed at protecting children on social media, gaming and streaming platforms. The ICO’s Children’s Code, which will come into force next Thursday following a 12-month grace period, aims to minimise the amount of data companies collect on children, reduce the use of “nudge” techniques that push youngsters to share more data, and implement age checks or ensure a high level of user privacy by default. But Coadec, a group representing UK tech start-ups, says it is “baffled” by the scope of the Code, while others argue it doesn’t go far enough.

The UK’s Children’s Code appears to have already prompted Instagram, YouTube and TikTok to update their policies. (Photo by LAURENCE GRIFFITHS/POOL/AFP via Getty Images)

“Children’s rights must be respected and we expect organisations to prove that children’s best interests are a primary concern,” said Stephen Bonner, executive director at the ICO, the UK regulator that devised the code. “The code gives clarity on how organisations can use children’s data in line with the law, and we want to see organisations committed to protecting children through the development of designs and services in accordance with the code.”

Bonner identified social media, video and music streaming sites, and gaming as the highest-risk services for children due to “inappropriate adverts; unsolicited messages and friend requests; and privacy-eroding nudges urging children to stay online”.

The code will only apply to companies that know or believe they process the data of children. Only the highest-risk cases will be closely monitored by the ICO – somewhat easing the regulatory burden on online services that don’t explicitly target kids. Serious breaches of the code could incur a fine of up to 4% of a company’s global annual turnover – the same as under the EU’s GDPR. 

“Companies that do not present a risk to children nor exploit their data will have little to do,” says Baroness Beeban Kidron, chair of 5Rights, the children’s digital rights group. “For those that create risky or extractive services, they will need to address children’s rights and needs.”

The code has likely played a part in forcing some large social media companies to update their policies in the past year. Instagram now prevents adults from messaging under-18s who do not follow them; YouTube has said it will stop ad targeting and personalisation for under-18s, turn off autoplay on videos, and activate “take a break” and “bedtime” reminders for this age group; and TikTok said it will stop sending push notifications to 13-15-year-olds and 16-17-year-olds after 9pm and 10pm respectively.

“They might not like to say so in public, but recent changes from Google, [Facebook] and TikTok are all in response to the Code – it is proof that regulation works,” says Kidron. “Why else would three global companies make similar announcements in the weeks running up to the 2nd of September?”


The new code, which has been hailed as pioneering, has already attracted global interest, noted Bonner. “Members of the US Senate and Congress have called on major US tech and gaming companies to voluntarily adopt the standards in the ICO’s code for children in America,” he said.

What is the scope of the ICO Children’s Code?

But some businesses have raised concerns that the regulation is still not clear enough. “When it comes to the Children’s Code, we are honestly baffled that we are years into this process and it’s still unclear what the ICO wants our community to do,” says Camilla de Coverly Veale, head of tech regulation at the Coalition for a Digital Economy (Coadec). “For example, the ICO has previously said that it will prioritise ‘high-risk’ areas for enforcement – which is great – but we have never had a definitive answer about what the ICO thinks are high-risk. Simply saying ‘social media’ isn’t helpful.”


There is also uncertainty over the age verification obligations that might eventually be introduced. The ICO is currently considering how organisations can approach age assurance – whether through age verification or age estimation. The body will formally announce its position on this issue in the autumn. 

The ICO website currently states that businesses can use a range of methods to ascertain age including self-declaration, artificial intelligence, third-party age verification services, account holder confirmation, technical measures and hard identifiers. In many cases, a website will “know” how old its users are thanks to data profiling.
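To illustrate how the “high level of user privacy by default” approach described above might look in practice, here is a minimal, hypothetical sketch in Python. All of the names in it (PrivacySettings, default_settings, the specific toggles shown) are invented for illustration and are not drawn from the ICO’s guidance; the point is simply that a service which cannot establish a user’s age with confidence falls back to child-safe defaults.

```python
from dataclasses import dataclass
from typing import Optional

ADULT_AGE = 18  # the code treats anyone under 18 as a child

@dataclass
class PrivacySettings:
    """Hypothetical per-account defaults a service might toggle."""
    geolocation_enabled: bool
    personalised_ads: bool
    profile_public: bool
    nighttime_push_notifications: bool

# Child-safe defaults: everything privacy-eroding is switched off.
HIGH_PRIVACY_DEFAULTS = PrivacySettings(
    geolocation_enabled=False,
    personalised_ads=False,
    profile_public=False,
    nighttime_push_notifications=False,
)

# Example adult defaults; a real service would set its own.
ADULT_DEFAULTS = PrivacySettings(
    geolocation_enabled=True,
    personalised_ads=True,
    profile_public=True,
    nighttime_push_notifications=True,
)

def default_settings(declared_age: Optional[int]) -> PrivacySettings:
    """Return account defaults from a self-declared age.

    If the age is unknown or under 18, fall back to the
    high-privacy defaults rather than the adult ones.
    """
    if declared_age is None or declared_age < ADULT_AGE:
        return HIGH_PRIVACY_DEFAULTS
    return ADULT_DEFAULTS

# An unverified sign-up gets the child-safe defaults.
print(default_settings(None))  # high-privacy
print(default_settings(15))    # high-privacy
print(default_settings(34))    # adult defaults
```

In a design like this, an unverified self-declaration alone never unlocks lower-privacy settings; a service would combine it with the other age-assurance signals the ICO lists above before relaxing any defaults.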

However, small businesses are concerned over potential demands to impose more stringent age checks. “There are profound implications for vast swathes of the ecosystem, particularly e-commerce start-ups,” says de Coverly Veale. “Does the ICO want them to age-gate their services? The ICO has previously said it didn’t expect all online companies to implement an age barrier straightaway [but] we don’t know what that means and if the ICO didn’t want companies to age-gate they shouldn’t have drafted a code that requires it.” 

At the other end of the spectrum, there are concerns that the legislation does not go far enough to address the harms faced by children online. “In relation to harm, the bill focuses on the ‘symptom’ rather than addressing the ‘cause’,” says Jenny Afia, head of the legal team at Schillings law firm. “It doesn’t recognise that content and activities which may seem innocuous can cause serious harm over a period of time. One diet ad, for example, may not be an issue. It’s the habitual promotion of these that is of concern.

“It is also wrong that individuals cannot make complaints under the proposed law,” Afia adds. “Instead, only ‘super-complaints’ are allowed by groups of individuals […] contrary to other types of law.”
