When Facebook recently mooted a plan to create an Instagram for under-13s, it was met with outcry from lawmakers and child advocacy groups. There is growing pressure on social media platforms to tighten safeguards for children, but a lack of consensus on what this should look like or how it should be enforced. A raft of litigation and newly proposed legislation means that children’s data privacy online is shaping up to be a contentious battleground.
Platforms such as Facebook, YouTube and TikTok do not officially allow users under 13, but the age check at sign-up is easily bypassed by younger children. The UK media regulator Ofcom reports that three-quarters of 5-15-year-olds and half of 3-4-year-olds watch YouTube, and 44% of 8-12-year-olds use TikTok.
This means that children’s data is regularly harvested and processed in a way that is indistinguishable from adults’. It can be used to serve adverts, packaged and sold to third parties for marketing purposes, and analysed to serve the user more ‘relevant’ content on the platform. 5Rights, the children’s digital rights group, highlights this last issue as one of the most significant: the platforms’ engagement-boosting algorithms often result in age-inappropriate or extreme material being surfaced to children, it says.
Many concerns about children’s social media usage relate to their ability to understand and consent to how their data will be used. An important facet is “the clash between children’s developing understanding, and capacity to be resilient and digitally literate on platforms, which are quite often opaque and not designed for them […],” says Sonia Livingstone, a professor in the Department of Media and Communications at the London School of Economics and Political Science, who has extensively studied children’s digital rights and online safety.
This highlights the complexities of designing a platform specifically for children. Facebook assured critics that its child Instagram wouldn’t serve ads. But YouTube’s service for children, YouTube Kids, also claims to be ad-free, yet has been accused of “smuggling in” advertising through product placement and sponsored content. Even if a platform doesn’t serve ads to children, the collection of user data can still pose problems.
Platforms aimed at kids could well be investing in the “future market of the child”, says Livingstone, where the idea is to “get the child now, and by the time they’re 13, they and all their friends will be embedded on the platform and ‘pow’, you can blast them with all you’ve got”. An embedded history of online behaviour, photos and other content that can’t be easily erased could be monetised by the site. “All of these things will serve to trap the child by the time they are old enough to perhaps make a different decision,” says Livingstone.
Legal challenges over children’s data privacy
Concerns about children’s data privacy and safety online are becoming more pronounced, and a flurry of lawsuits has been launched against social media companies. In 2019, YouTube reached a $170m settlement with the Federal Trade Commission (FTC) over claims it violated the Children’s Online Privacy Protection Act (COPPA) – a piece of legislation aimed at protecting American children from unscrupulous data practices – by making money from collecting personal data without parental consent.
This has inspired another lawsuit in the UK. Privacy expert Duncan McCann is suing YouTube for allegedly breaching children’s privacy and data rights. The claim contends that Google collects children’s data without parental consent, in breach of laws including the UK Data Protection Act and the EU’s General Data Protection Regulation (GDPR). The class action is being brought on behalf of five million children (a rough estimate of the number affected), and if it succeeds, Google could owe children and parents more than £2bn.
Following the FTC’s decision to fine TikTok $5.7m for “infractions related to the collection of personal information of users 13 and under”, the UK’s Information Commissioner’s Office (ICO) has launched an investigation into the platform’s handling of children’s data. The investigation is ongoing, with the findings due to be published later this year.
TikTok and its parent company ByteDance are also facing a class action suit in the UK, with an anonymous 12-year-old girl as the lead claimant. It alleges that TikTok illegally collects children’s private information in the UK and European Economic Area, including telephone numbers, videos, location data, and biometric data, and has therefore violated the UK Data Protection Act 2018 and Articles 5, 12, 14, 16, 25, 35, and 44 of the GDPR.
Jenny Afia, head of the legal team at Schillings law firm, expects these legal actions to continue to proliferate, in part due to the inadequacy of current legislation governing child users of social media platforms. “I think there will be more challenges as the nature of the conduct is becoming clearer, as we’re all becoming a bit more aware of the risks and what’s actually going on to manipulate children,” she says.
A shifting regulatory landscape
This speaks to the distance between child data rights in theory and practice. “Children have existing protections for their data, in the form of COPPA, GDPR [and] the Children’s Code, but the problem is that when there’s no minimum standards or enforcement for age assurance, companies are under no obligation to treat a child – which they know is a child from profiling – as a child,” says Tony Stower, director of external engagement at 5Rights.
To help address some of these issues, the ICO introduced the Children’s Code in September 2020. The code comprises 15 standards of “age-appropriate design” and applies to any online service likely to be accessed by children. Its requirements include undertaking a Data Protection Impact Assessment to assess and mitigate risks to children, ‘high privacy’ default settings for new accounts, and data minimisation. Companies are expected to become compliant by later this year, although the code isn’t legally binding.
Livingstone says that many social media platforms are known to flout GDPR and other data protection and privacy requirements. For example, they process data on the grounds of “legitimate interest”, a basis several European data protection authorities have said is inappropriate for this kind of processing, which instead requires explicit consent. “Whether the ICO will go after Facebook, Instagram… in relation to the age-appropriate design code is a really interesting case,” she says.
In the US, senators Edward J. Markey (D-Mass.) and Bill Cassidy (R-La.) have proposed updating COPPA into the Children and Teens’ Online Privacy Protection Act. This would extend COPPA’s protections to teenagers by “prohibiting internet companies from collecting personal information from anyone 13-15 years old without the user’s consent; creating an online ‘Eraser Button’ by requiring companies to permit users to eliminate personal information from a child or teen; and implementing a ‘Digital Marketing Bill of Rights for Minors’ that limits the collection of personal information from teens”.
In the UK, the newly proposed Online Safety Bill also aims to make the internet safer for children. Although the bill has been criticised for threatening civil liberties, it has been hailed as a welcome step by children’s charities such as the NSPCC. But Livingstone says the bill isn’t a silver bullet. “It’s setting the bar quite high in terms of addressing the most egregious harms, but the ones that many people worry about for young children are absolutely not on the agenda,” she says.
Although it will target extreme harms such as pornography and racist abuse, “it won’t address the fact that there’s a relentless promotion of women who are thin and beautiful and blonde [and] the fact that many of those images of women come along with adverts for cosmetic surgery […]”, Livingstone points out.
Taking the “four Cs” of online risk – content, contact, conduct and contract – the Online Safety Bill only addresses the first C, “content”, says Schillings’ Afia. “It’s not looking at the whole ecosystem behind that content: the way social media platforms are created and the algorithms – the persuasive design features that make us all addicted to our screens – [or] the way products and financial transactions are inherent to so many of the platforms, including Instagram.”
A better internet for children?
To date, a global coalition of more than 99 children’s advocacy groups and expert individuals has urged Facebook to scrap its plans for a kids’ Instagram, citing concerns about children’s data privacy, safety, and physical and mental health. In the US, more than 40 attorneys general have signed a letter imploring Facebook to halt the project.
“[Instagram’s] relentless focus on appearance, self-presentation, and branding presents challenges to children’s privacy and wellbeing,” says Stower. “Rather than building a system which prioritises children’s best interests, Instagram Junior looks like it is more about hooking kids to the Facebook platform at an even younger age: collecting more data, encouraging children to share more of their lives online and putting them at risk from groomers and others who would do them harm.”
But while many are sceptical of Facebook’s plans based on its track record, this doesn’t mean a better social platform for kids isn’t possible. Children are going to continue using the internet in ever greater numbers, so it’s in their best interests to design platforms that genuinely cater to their needs, says Livingstone.
She says looking solely at the risks without balancing these against a child’s right to a positive online experience leads to “a highly protectionist world”. But the way platforms are designed, “in which more contact is always better than fewer, in which more extreme content is always more engaging, in which the whole mechanism of the platform is to push you to spend more time and spread yourself further and engage more… is at the heart of the problem here,” says Livingstone.
She says instead of the global stage that the likes of Instagram represent, what children want is closer to an online village, where one can wander over to see one’s friends, grandparents and other family members. She uses the example of a child sharing a picture of their cat on a site like Instagram: “That innocent and positive desire would trigger an algorithmic process that says, ‘wouldn’t you like to share your cat with more people’ or ‘wouldn’t you like to see more pictures of cats’ or ‘maybe you could make money if lots of people like your cat’. All of those mechanisms are absolutely not what children are calling for.”