The internet and social media have proven to be a powerful force that, in many ways, seems to be growing out of our control. Those who have shaped the most popular social media platforms have, in some cases, naively created a new source of addiction that can affect our health, both mental and physical. This is especially true for children, whose brains are still forming and are therefore more impressionable and vulnerable.
For this reason, this past Thursday the UK implemented a revolutionary code to create “a better internet for children.”
What is the children’s code?
The Information Commissioner’s Office, the UK’s independent data authority, introduced the Age-Appropriate Design Code in September of last year, giving companies a year to adjust to the new regulations.
Before the code was introduced, social media, gaming platforms, and video and music-streaming sites were permitted to use and share children’s personal data without regulation, which could potentially cause physical, emotional, and financial harm.
The main concerns are privacy, inappropriate advertising, and the strategies that digital companies use to keep children online for longer periods of time. This includes tactics such as auto-playing videos on a website after one has finished.
How will the code make the internet “better”?
A digital company that targets children now must:
- Design services to be age-appropriate and in children’s best interests.
- Consider whether their use of data keeps them safe from commercial and sexual exploitation.
- Provide a high level of privacy by default.
- Stop using design features that encourage them to provide more data.
- Switch off geo-location services that track where they are based.
- Map what personal data they collect from UK-based children.
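As a rough illustration, the “high privacy by default” and “geo-location off” requirements amount to choosing protective default settings for under-18 accounts. The sketch below is hypothetical; the field names and the adult defaults are assumptions for illustration, not taken from the code itself.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical account defaults reflecting the code's
    privacy-by-default requirements for children."""
    profile_private: bool = True       # high level of privacy by default
    geolocation_enabled: bool = False  # geo-location switched off
    personalized_ads: bool = False     # no ad targeting or personalization
    autoplay_enabled: bool = False     # no engagement-prolonging features

def default_settings_for(age: int) -> AccountSettings:
    # Under-18 users get the protective defaults; adults get the
    # (assumed) conventional defaults and could change them later.
    if age < 18:
        return AccountSettings()
    return AccountSettings(profile_private=False,
                           geolocation_enabled=True,
                           personalized_ads=True,
                           autoplay_enabled=True)
```

The key design point is that a child never has to find and flip a setting to be protected; the safest configuration is the starting state.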
There have been complaints that these guidelines are somewhat unclear and leave organizations unsure of the exact expectations, leading some to request a more thorough definition of what falls within the code’s scope.
What changes have firms like TikTok and YouTube made?
While the code may still need refinement, big social media companies have already made policy changes that demonstrate they are taking it seriously.
For instance, YouTube has turned off default auto-play on videos and stopped ad targeting and personalization for all children. TikTok will halt notifications after 21:00 for users aged 13 to 15 and after 22:00 for users aged 16 to 17. Instagram will protect children by blocking adults from messaging children who do not follow them, defaulting all children’s accounts to private, and requiring users to enter their date of birth to log in.
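An age-banded notification curfew like TikTok’s can be sketched as a simple rule on age and time of day. This is a minimal illustration, not TikTok’s implementation; in particular, the 08:00 morning resume time is an assumption, since the article only states when notifications stop.

```python
from datetime import time

RESUME = time(8, 0)  # assumed morning resume time; not stated in the article

def notifications_allowed(age: int, now: time) -> bool:
    """Age-banded curfew: notifications stop at 21:00 for 13-15s
    and at 22:00 for 16-17s, resuming (by assumption) at 08:00."""
    if 13 <= age <= 15:
        curfew = time(21, 0)
    elif 16 <= age <= 17:
        curfew = time(22, 0)
    else:
        return True  # no curfew outside these age bands
    return RESUME <= now < curfew
```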
What penalties are there?
If a company fails to comply with the code, it will be subject to the same potential penalties as those who breach the General Data Protection Regulation (GDPR), which include a fine of up to four percent of global turnover.
For now, punishment will not take place during this transition period, and instead, support will be provided to organizations. That said, the ICO retains the power to investigate or audit organizations it believes are intentionally uncompliant.
Companies must provide proof that their services have been improved to bring them in line with the code.
“Social media platforms, video and music streaming sites, and the gaming industry” will face the most scrutiny, writes Stephen Bonner, the ICO’s executive director of regulatory futures and innovation, in a blog post.
Although the code is not mandatory in the US, members of the US Senate and Congress have called on major tech companies to voluntarily adopt the standards it outlines. The Data Protection Commission in the Republic of Ireland is also drafting similar regulations.
How will companies know the age of users?
Most social media sites require users to be at least 13 years of age, but that doesn’t stop much younger users from logging on. How companies choose to determine the age of their users is up to them, but the ICO offers some suggestions for age verification that include:
- Self-declaration.
- Use of artificial intelligence.
- Third-party age verification services.
- Technical measures.
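The weakest of these options, self-declaration, reduces to computing an age from a user-supplied date of birth and comparing it against a minimum. The sketch below assumes the common 13-year minimum; it trusts user input entirely, which is exactly why the ICO also lists AI and third-party verification as alternatives.

```python
from datetime import date

MINIMUM_AGE = 13  # typical minimum age on most social media platforms

def age_from_dob(dob: date, today: date) -> int:
    """Compute age in whole years from a self-declared date of birth."""
    years = today.year - dob.year
    # Subtract one if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def passes_self_declaration(dob: date, today: date) -> bool:
    # Self-declaration only verifies what the user chose to type in.
    return age_from_dob(dob, today) >= MINIMUM_AGE
```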
The ICO plans to establish its position on this issue later this fall.