The UK's bold quest to protect kids online raises lots of questions

Regulators argue children are using an internet that simply isn't designed for them – and now they're taking action

It’s not just what kids can see on the internet that’s cause for regulatory consternation, it’s also what the internet can see about them.

And so, in the wake of GDPR, the UK’s Information Commissioner’s Office has laid down a new set of rules that will require digital services to strengthen their privacy settings for users under the age of 18.

In simple terms, the Age Appropriate Design Code details new standards for how everything from social media platforms to internet-connected toys should collect and process data about children.

Information commissioner Elizabeth Denham says that while one in five UK internet users is under 18, “they are using an internet that was not designed for them”. She says that the children her team met during the research for this code described data collection practices as “rude”, “nosy” and “freaky”. The code is intended to put an end to that by demanding that internet companies take extra care in gathering and handling minors’ data.

According to a spokesperson for the Department for Digital, Culture, Media and Sport, the code will be an “important part” of ongoing work to make the UK the “safest place in the world to be online”. But will it? And, more importantly, what does that even mean?

The ICO prepared this statutory code of practice under the Data Protection Act 2018 – a law passed in May 2018 that is often described as a complement to the European Union’s GDPR. Section 123 of the act required the ICO to prepare new standards of age-appropriate design for digital products and services used by children.

EU law requires member states to notify the European Commission of any new technical regulations, which means the UK government will have to do just that. The Commission’s assessment period in similar cases usually lasts around three months. Following that phase, DCMS secretary Nicky Morgan will lay the code before parliament – essentially, it will be made available for MPs to read and possibly object to.

If no objection is made within the 40 sitting days after the code has been laid, it will be issued, and finally come into force 21 days later. Businesses will, however, be granted a 12-month “transition period” to adapt to the new regulatory framework. In practice, this means the code won’t have any tangible impact until autumn 2021. At that point, companies that do not comply will face sanctions, including fines of up to four per cent of their global turnover.

And, fret not, Brexit won’t have any impact. According to the ICO, the code will still apply regardless of the eventual outcome of Brexit and post-Brexit negotiations – given that a UK version of GDPR will become part of British law.

The ICO’s code establishes 15 standards that all companies will have to abide by when designing products that might be used by children. Its declared aim is to minimise the amount of data that companies harvest from users under 18, and to give children and their parents more control over the quantity and type of data collected as they use certain services or devices.

In more detail, that means each service or app should adopt “high privacy” settings by default, rather than requiring individual users to switch them on themselves. Options that use data for profiling (for instance, to target the content shown to each user) should also be off by default; geolocation should be off by default too – and even if a user activates it, it should roll back to “off” after every session.
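As a rough illustration, here is a minimal TypeScript sketch of those defaults. Everything in it – the PrivacySettings shape, defaultSettings, endSession – is invented for this article; the ICO’s code describes the behaviour, not any particular implementation.

```typescript
// Hypothetical illustration only: these names are not taken from the
// ICO's code or any real API.

interface PrivacySettings {
  highPrivacy: boolean;  // "high privacy" on by default
  profiling: boolean;    // profiling-based targeting off by default
  geolocation: boolean;  // location sharing off by default
}

// The defaults a child-likely service would ship with under the code.
function defaultSettings(): PrivacySettings {
  return { highPrivacy: true, profiling: false, geolocation: false };
}

// Even if a user switches geolocation on, the code says it should
// roll back to "off" once the session ends.
function endSession(settings: PrivacySettings): PrivacySettings {
  return { ...settings, geolocation: false };
}

// A user turns geolocation on mid-session...
let session: PrivacySettings = { ...defaultSettings(), geolocation: true };
// ...and it reverts when the session closes.
session = endSession(session);
console.log(session); // { highPrivacy: true, profiling: false, geolocation: false }
```

The key design point the code demands is the last step: an opt-in that never silently persists beyond the session in which it was granted.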

The code’s fourth standard – on transparency – prescribes that privacy information and other company policies on data use be explained in clear, direct language that children can grasp. The ICO even suggests using “diagrams, cartoons, graphics, video and audio content, and gamified or interactive content that will attract and interest children, rather than relying solely on written communications.” We might be in for an era of infantilised, if less obtuse, terms and conditions.

One standard that is bound to attract controversy (more on this later) dictates that each service provider should either establish the age of its users with confidence – so that it can treat them in accordance with the ICO’s standards – or apply the same high-privacy default approach to all of its users.

But working out which companies are supposed to comply with the code is tricky. According to the ICO’s document, “[t]his code applies to ‘information society services likely to be accessed by children’ in the UK”. That includes apps, programs, connected toys, search engines, social media platforms, streaming services, online games, news and educational websites and other websites offering goods and services. Importantly, it is not restricted to services specifically directed at children. Simply put, barring websites that enforce fairly strong age verification measures, the vast majority of websites and services will need to comply with the standards.

The ICO suggests that service providers justify a decision not to adopt the code by producing documents showing that children are unlikely to use their service. These documents can include “market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures,” according to the ICO. Of course, if evidence emerges that children are indeed starting to use a certain service, it will be required to start enforcing the age-appropriate standards.

So how are companies supposed to ascertain a user’s age? The ICO is method-agnostic, listing a series of techniques ranging from self-declaration (for low-risk websites), through AI-enabled age estimation and third-party verification services, up to requesting that every user submit a copy of their ID or passport. Alternatively, rather than tailoring their services’ settings by age cohort, companies might simply decide to enforce the same age-appropriate standards for all their users.

“[This] doesn’t mean that you have to ignore any information you do have about the user’s age, or that adult users have to be infantilised. It just means that all users will receive some basic protections in how their personal data is used by default,” the ICO’s document reads.
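To make the trade-off concrete, here is a hedged TypeScript sketch of that choice. The risk tiers and method names paraphrase the options the ICO lists; chooseAgeAssurance and its signature are hypothetical, since the code prescribes no particular implementation.

```typescript
// Illustrative only: risk tiers and method names are paraphrased from
// the ICO's list of options; nothing here is a prescribed mechanism.

type Risk = "low" | "medium" | "high";

type AgeAssurance =
  | "self-declaration"          // acceptable for low-risk services
  | "ai-age-estimation"         // e.g. machine-learned age estimates
  | "third-party-verification"  // an external verification service
  | "id-document-check"         // ID or passport upload
  | "high-privacy-for-all";     // skip age checks, protect everyone

function chooseAgeAssurance(risk: Risk, establishAge: boolean): AgeAssurance {
  // One option: don't try to establish age at all, and give every
  // user the child-level protections by default.
  if (!establishAge) return "high-privacy-for-all";

  // Otherwise, match the strength of the check to the risk the
  // service's data processing poses.
  switch (risk) {
    case "low":
      return "self-declaration";
    case "medium":
      return "ai-age-estimation";
    case "high":
      return "id-document-check";
  }
}

console.log(chooseAgeAssurance("low", true));    // "self-declaration"
console.log(chooseAgeAssurance("high", false));  // "high-privacy-for-all"
```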

Perhaps unsurprisingly, some in the technology industry have been hostile to the changes. The Coalition for a Digital Economy (Coadec), a lobbying group for British technology startups, raged against the ICO’s code, calling it “a textbook example of bad regulation that will entrench big companies.”

Coadec alleges that the code will require technology companies to design multiple versions of the same product, and that such a burden will ultimately favour the tech giants that have the staff and capital to devote to the fine-tuning effort.

The organisation also suggests that companies will have to collect more data on their users than they do now in order to carry out age verification checks. On that point, the ICO’s document underlines that companies gathering additional data to verify a user’s age should not use that data for anything other than age verification.

“There has been no impact assessment on the costs companies will incur from these ill-considered plans to torpedo startup business models,” Coadec said in a statement.

This article was originally published by WIRED UK