When she first began talking to her peers in the House of Lords about the rights of children on the internet, Baroness Kidron says she looked like “a naysayer”, like someone who was “trying to talk about wooden toys” or, in her husband’s words, like “one middle-aged woman against Silicon Valley”. It was 2012 and the film-maker and recently appointed life peer was working on her documentary InRealLife, spending “hundreds of hours in the bedrooms of children” to discover how the internet affects young lives. What she saw disturbed her.
“I did what they were doing – gaming, falling in love, watching pornography, going to meet-ups, making music – you name it, it happened,” Beeban Kidron says. The film explored everything from children’s exposure to porn, to rampant online bullying, to the way privacy is compromised online. But Kidron noticed that one thing underpinned it all: on the internet, nobody knows you’re a kid. “Digital services and products were treating them as if they were equal,” she says. “The outcome of treating everyone equally is you treat a kid like an adult.”
Almost a decade later, Kidron has pushed through a Children’s Code that aims to change this landscape for ever. The Age Appropriate Design Code, introduced via an amendment to the Data Protection Act 2018, came into effect this month. It requires online services to “put the best interests of the child first” when designing apps, games, websites and internet-connected toys that are “likely” to be used by kids.
In total, there are 15 standards that companies need to adhere to in order to avoid being fined up to 4% of their global turnover. These include offering “bite-size” terms and conditions for children; giving them “high privacy” by default; turning off geolocation and profiling; and avoiding “nudge techniques” that encourage children to turn off privacy settings. The code, which will be enforced by the Information Commissioner’s Office (ICO), also advises against “using personal data in a way that incentivises children to stay engaged”, such as feeding children a long string of auto-playing videos one after the other.
The code was introduced in September 2020 but gave companies a 12-month transition period; in that time, the world’s tech giants have seemingly begun responding to the sting of Kidron’s sling. Instagram now prevents adults from messaging children who don’t follow them on the app, while anyone under 16 who creates an account will have it set to private by default. TikTok has implemented a bedtime for notifications: teens aged 13-15 will no longer be pinged after 9pm. Meanwhile, YouTube has turned off autoplay for users aged 13-17, and Google has blocked the targeted advertising of under-18s.
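To make the shape of these changes concrete, here is a minimal sketch – in Python, with invented field names, a self-declared age as input and thresholds copied from the platform changes just described, not from any company’s actual code – of what age-banded, high-privacy defaults might look like at account creation:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AccountSettings:
    private_account: bool                    # profile visible only to approved followers
    targeted_ads: bool                       # personalised advertising allowed
    geolocation: bool                        # precise location collection switched on
    autoplay: bool                           # next video starts automatically
    quiet_hours: Optional[Tuple[int, int]]   # no push notifications between these hours

def defaults_for_age(age: int) -> AccountSettings:
    """Apply high-privacy defaults by age band, mirroring the changes described above."""
    under_18 = age < 18
    return AccountSettings(
        private_account=age < 16,        # Instagram: accounts private by default under 16
        targeted_ads=not under_18,       # Google: no targeted advertising of under-18s
        geolocation=not under_18,        # code standard: geolocation off by default
        autoplay=not (13 <= age <= 17),  # YouTube: autoplay off for 13- to 17-year-olds
        quiet_hours=(21, 8) if 13 <= age <= 15 else None,  # TikTok: no pings after 9pm
    )

print(defaults_for_age(14))
```

Note that several of these protections only switch on at 13 – which is exactly the gap the next paragraph picks at.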
But hang on, why does TikTok’s bedtime only apply to those 13 and over? Are toddlers OK to use the app until 2am? You’ve just spotted the first flaw in the plan. While social media sites require users to be at least 13 to sign up for their services (in line with America’s 21-year-old Children’s Online Privacy Protection Act), a quick glance at reality shows that kids lie about their age in order to snap, share and status-update. Creating a system in which children can’t lie, by, for example, requiring them to provide ID to access an online service, ironically risks compromising their privacy further.
“There is nothing that stops us having a very sophisticated age-check mechanism in which you don’t even know the identity of the person, you just know that they’re 12,” Kidron argues, pointing to a report on age verification that she recently worked on with her organisation 5Rights Foundation, entitled But how do they know it is a child?. Third-party providers, for example, could confirm someone’s age without passing the underlying data on to tech giants, or capacity testing could allow websites to estimate someone’s age based on whether they can solve a puzzle (no prizes for figuring out the numerous ways that could go wrong).
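What that could look like in practice: the sketch below is purely illustrative (it is not the 5Rights proposal, the function names are hypothetical, and it uses a shared-secret token rather than the asymmetric signature a real scheme would need). A third-party provider checks the evidence and hands back a signed claim containing nothing but an age band, so the platform learns “this user is 13-17” without ever seeing a name or a document.

```python
import base64
import hashlib
import hmac
import json
import secrets
import time

# Key shared between the age-check provider and the online service. A real
# scheme would use an asymmetric signature so the service cannot mint tokens
# itself; a shared key keeps this sketch standard-library only.
PROVIDER_KEY = secrets.token_bytes(32)

def issue_age_token(age_band: str) -> str:
    """Provider side: attest to an age band only, carrying no identity data."""
    claim = {
        "age_band": age_band,            # e.g. "under_13", "13_17", "18_plus"
        "issued_at": int(time.time()),   # lets the service reject stale tokens
        "nonce": secrets.token_hex(8),   # makes every token unique
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_age_token(token: str, max_age_seconds: int = 3600):
    """Service side: check signature and freshness; learn only the age band."""
    try:
        payload_b64, sig = token.split(".")
        payload = base64.urlsafe_b64decode(payload_b64)
        expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, sig):
            return None                  # tampered or forged token
        claim = json.loads(payload)
        if time.time() - claim["issued_at"] > max_age_seconds:
            return None                  # too old to trust
        return claim["age_band"]
    except (ValueError, KeyError):
        return None

# The service never learns who the user is, only which band they sit in.
token = issue_age_token("13_17")
print(verify_age_token(token))  # -> 13_17
```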
Whatever the solution, Kidron is currently working on a private member’s bill that sets minimum standards of age assurance, thereby preventing companies from choosing their own “intrusive, heavy handed or just terrible, lousy, and ineffective” techniques.
How did Kidron go from looking like a “naysayer” to changing the landscape so drastically? Kidron began making documentaries in the 80s before working in Hollywood (most notably directing the Bridget Jones sequel The Edge of Reason). After becoming a baroness, she founded the 5Rights Foundation to fight for children’s digital rights. She says she had her “early adopters” in parliament, including the archbishop of York, Stephen Cottrell, Conservative peer Dido Harding and Liberal Democrat peer Timothy Clement-Jones. “That was my gang,” Kidron says, but others remained sceptical for years. “The final set of people only came on board this summer, once they saw what the tech companies were doing.”
The Children’s Code as a whole defines a child as anyone under 18, in line with the United Nations Convention on the Rights of the Child (UNCRC). For Kidron, it’s about much more than privacy – “a child’s right to unfettered access to different points of view is actually taken away by an algorithmic push for a particular point of view,” she argues, also noting that the right to the best possible health is removed when companies store and sell data about children’s mental health. “It’s nothing short of a generational injustice,” she says. “Here was this technology that was purporting to be progressive, but in relation to children it was regressive – it was taking away the existing rights and protections.”
How did these claims go down in Silicon Valley? Conversations with executives were surprisingly “very good and productive”, according to Kidron, but she ultimately realised that change would have to be forced upon tech companies. “They have an awful lot of money to have an awful lot of very clever people say an awful lot of things in an awful lot of spaces. And then nothing happens,” she says. “Anyone who thinks that the talk itself is going to make the change is simply wrong.”
And yet while companies must now comply with the code, even Kidron admits, “they have to comply in ways that they determine”. TikTok’s bedtime, for example, seems both arbitrary and easy to get around (children are well versed in changing the date and time on their devices to proceed in video games). Yet Kidron says the exact hour is irrelevant – the policy is about targeting sleeplessness in children, which in turn enables them to succeed at school. “These things seem tiny… but they’re not. They’re about the culture and they’re about how children live.”
As for children working their way around barriers, Kidron notes that transgression is part of childhood, but “you have to allow kids to transgress, you can’t just tell them it’s really normal”. “The problem we have is kids who are eight are looking at hardcore, violent, misogynistic porn and there’s no friction in the system to say, ‘Actually, that’s not yours.’”
Yet problems also arise when we allow tech companies, not parents, to set boundaries for our children. In 2017, YouTube came under fire after its parental controls blocked children from seeing content made by LGBTQ+ creators (YouTube initially apologised for the “confusion” and said only videos that “discuss more sensitive issues” would be restricted in the future). Kidron says she’s “not a big takedown freak” and is “committed to the idea that children have rights to participate”, but can the same be said of companies hoping to avoid fines? Numerous American websites remain inaccessible in Europe after the implementation of the General Data Protection Regulation (GDPR) in 2018, with companies preferring to restrict access rather than adapt.
For now, it remains to be seen how the Children’s Code will be enforced in practice; Kidron says it’s “the biggest redesign of tech since GDPR”, but in December 2020 a freedom of information request revealed that more than half of the GDPR fines issued by the ICO remained unpaid.
Still, Kidron is certain of one thing: that tech companies are “disordering the world” with their algorithms – “making differences of their terms for people who are popular and have a lot of followers versus those who are not” and “labelling things that get attention without really thinking about what that attention is about”. These are prescient remarks: a day after we speak, the Wall Street Journal reveals that Facebook has a program that exempts high-profile users from its rules, and reports on internal Facebook research showing that Instagram is harmful to teens. One internal presentation slide read: “We make body image issues worse for one in three teen girls.” Instagram’s head of public policy responded to the report in a blog post, writing: “The story focuses on a limited set of findings and casts them in a negative light.”
Whether or not Kidron was once “one middle-aged woman against Silicon Valley”, today she has global support. The recent changes implemented by social media companies are not just UK-based, but have been rolled out worldwide. Kidron says her code is a Trojan horse, “starting the conversation that says, you can regulate this environment”.
But this Trojan horse is only beginning to open up. “We had 14 Factory Acts in the 19th century on child labour alone,” Kidron says, adding that the code is likely to be the first of many more regulations to come. “I think today we air punch,” she says, when asked how it feels to have led the charge for change. “Tomorrow, we go back to work.”