British regulators are demanding that 11 social media and video-sharing platforms bolster their protections for children’s privacy. The move follows a comprehensive review of 34 platforms that revealed widespread shortcomings in safeguarding young users.
The UK’s Information Commissioner’s Office (ICO) is stepping up enforcement against companies that fail to comply with the Children’s Code, a regulatory framework designed to protect minors online. Eleven platforms are facing scrutiny over default privacy settings, geolocation data, and age verification measures.
Children’s Privacy Paramount
“Online services catering to children must prioritize privacy,” said Emily Keaney, the ICO’s deputy commissioner. “We won’t tolerate companies that put young people at risk of harm.”
“There is no excuse for online services likely to be accessed by children to have poor privacy practices. Where organisations fail to protect children’s personal information, we will step in and take action,” she added.
The regulator is also investigating targeted advertising practices aimed at children, seeking to align industry behavior with both the Children’s Code and broader data protection laws.
In a bid to gain deeper insight into how social media affects children’s privacy, the office is launching a call for evidence. It focuses on two areas:
- How children’s personal information is currently being used in recommender systems (algorithms that use people’s details to learn their interests and preferences in order to deliver content to them); and
- Recent developments in the use of age assurance to identify children under 13 years old.
Researchers, industry stakeholders, and civil society organizations are encouraged to contribute their expertise on recommender systems and age assurance technologies.
The findings from this research will inform future regulatory actions to strengthen child protections.
The tech industry has undergone significant changes in response to the Children’s Code, but the regulator emphasizes the ongoing need for vigilance.
“Our world-leading Children’s Code has made a tangible difference in protecting children from targeted advertising,” Keaney added. “But we must continue to push for improvements to ensure a safer online environment for young people.”
The ICO did not immediately respond to The Cyber Express’ request for comment on which 11 platforms are involved, how much time they have to respond to the notice, and whether they would be fined if repeatedly found non-compliant.
The latest warning to social media and video-streaming platforms follows the ICO’s £12.7 million fine against TikTok last year for multiple breaches of data protection law, including allowing more than one million children under 13 to use its platform in 2020 without parental consent, contrary to its own terms of service at the time.
The U.S. has also taken children’s privacy seriously, reprimanding Meta for misleading parents about its handling of children’s data. Meta, however, vowed to fight the allegations “vigorously,” calling them a “political stunt.”