The UK’s Information Commissioner’s Office (ICO) has imposed a £12.7m fine on video-sharing social media platform TikTok for unlawful collection and use of data on children under 13 years of age. The breaches of the UK General Data Protection Regulation (GDPR) in question took place between May 2018 and July 2020.
The regulator said that TikTok did not do enough to check who was using its platform or take action to remove underage users. It believes up to 1.4 million children under 13 used TikTok in 2020, despite the service having terms and conditions (Ts&Cs) in place that forbid them from creating an account.
Under UK data protection law, online services that use personal data when offering services to under-13s must have consent from parents and carers. The ICO said TikTok took no steps to seek consent, even though it must have been aware there were under-13s using its service.
The regulator’s probe additionally found that TikTok staffers had raised concerns about the issue internally with senior managers, but that these concerns had been ignored. It also found that TikTok failed to provide users with proper information about how it collected, used and shared their data, meaning many users – particularly children – could not have made informed choices about using the platform, and that it failed to ensure personal data on UK users was processed lawfully, fairly and transparently.
“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws,” said information commissioner John Edwards.
“As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data,” he added. “That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.
“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”
Lower fine than initially proposed
The fine is substantially lower than the £27m the ICO had initially proposed to levy. The reduction follows representations from TikTok, in light of which the regulator chose not to pursue a provisional finding related to the unlawful use of special category data – that is to say, data on characteristics such as racial and ethnic background, gender identity and sexual orientation, religious beliefs, trade union membership, and health data, including biometric and genetic data.
A spokesperson for TikTok said: “TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”
Policy changes
TikTok has made a number of changes to its internal policies and practices since 2020, including introducing more tools to detect when users are lying about their age, providing additional moderator training, and giving parents and carers options to intervene and have children’s accounts removed.
Alan Calder, CEO of IT governance, risk and compliance practice GRC International Group, said: “This was a fine that was always going to happen – and it has been pretty inevitable ever since the ICO issued its Notice of Intent last autumn. UK GDPR is clear that, under the age of 13, children must have parental consent to sign up to an online platform. That has been the law since May 2018. Compliance was never going to be easy, but that’s not an excuse for ignorance.”
ESET global security advisor Jake Moore added: “This is yet another blow to the social media giant, which has gone to extra lengths to show that it can protect user data. Confidence in TikTok is already lower than they would want, so this will be extra painful. Although the users of the app may be slow to act upon revelations such as this, each hit to the site will damage the brand a little bit more, and individual privacy questions will soon become more apparent among users.
“Anyone using the app should think about what data the app might be collecting on them and decide if the pay-off is worth it.”
More information on protecting children online can be found in a recently published ICO code of practice, which sets out 15 standards that online services should meet to safeguard children and ensure they have the best possible experience of those services.