It may be hard for you to believe that TikTok has been around for less than four years, writes Jon Belcher, specialist data protection and information governance lawyer at Excello Law.
So much has happened in that time, including the Chinese video-sharing app becoming a ubiquitous part of our lives alongside Facebook and YouTube. Since the COVID-19 pandemic began to spread in earnest in spring 2020, TikTok has enjoyed spectacular growth.
You may be a TikTok user yourself, or perhaps your children have joined an army of mini TikTokers, creating their own 15-second videos with lip-synced music, madcap antics and comic memes to keep themselves amused and to share with friends online during those interminable weeks of lockdown.
Or perhaps you have been concerned by the more alarming TikTok trends popular with some teenagers, such as the fire challenge: spraying hairspray onto a mirror, dimming the lights, and then setting the spray alight.
A platform with mass appeal for minors
If so, you are not alone. According to Ofcom, 44% of eight to 12-year-olds in the UK use TikTok, despite under-13s being forbidden on the platform according to its policies. The app now claims to have in excess of 800 million users worldwide, including over 100 million active US users. Globally, it was the most downloaded app in both 2019 and 2020.
Meanwhile, its Chinese parent company, ByteDance, has recently been valued at an estimated US$400 billion ahead of an anticipated initial public offering (IPO) on the Hong Kong Stock Exchange. If this were to go ahead, it would make the Beijing-based start-up the third-largest Internet giant in China by value, after Tencent and Alibaba.
But one potential storm cloud looms on the horizon to thwart this ambition in the shape of a multi-billion-pound claim being brought in the English courts. It concerns alleged widespread breaches of data protection legislation in respect of the personal data of children in the UK and across the European Economic Area.
You might well ask: how did a social media platform for creating, sharing and discovering short videos ever reach this position?
Knowingly effecting widespread data breaches
According to a website launched in support of the claim, https://tiktokdataclaim.uk, the case is being led by Anne Longfield OBE, who was the Children’s Commissioner for England between March 2015 and February 2021. In bringing the case, she claims that TikTok and its parent company, ByteDance, not only collected data illegally from millions of children, but that they did so knowingly, before sharing it with unknown third parties for profit.
Longfield believes more than 3.5 million children in the UK alone could have been affected. With her support, the legal claim was initially filed by an anonymous 12-year-old girl in the High Court last December. In a public statement, Longfield said: ‘TikTok is a hugely popular app, but behind the fun songs, dance challenges and lip-sync trends, lies something far more sinister.’
What makes the case against TikTok potentially so significant is that it has been brought as a representative action on behalf of all children under 16 across the EU (or under 13 in the UK) who have used the TikTok app since 25 May 2018, regardless of whether they have an account and irrespective of their privacy settings.
That date matters because it was when the EU’s General Data Protection Regulation (GDPR) first came into force after a very prominent public information campaign in the months leading up to its introduction. The GDPR strengthens data protection laws across the EU and contains specific rules around children’s data.
Longfield alleges that TikTok does not provide sufficient information about the personal data it collects and does not obtain the necessary consents for its use. It is further suggested that the company is deliberately opaque about who has access to this data.
Collecting facial recognition data
According to Longfield’s case, TikTok uses facial recognition software, which collects data that is then made available to unknown third parties, such as potential advertisers. The claim alleges that this collection of biometrics and other personal data violates Articles 5, 6, 8, 12, 14, 25, 35 and 44 of the GDPR. These alleged breaches of data protection law include: a lack of transparency, a failure to obtain parental consent when required, and a failure to have a valid lawful basis for the processing of personal data.
From TikTok’s perspective, there could be very significant implications should the courts find against it. First, there is the potentially enormous cost of paying damages to a very large group of individuals, numbered in the millions. Even if the damages awarded to each individual amounted to only a relatively modest sum, the maths would put the aggregate figure in the billions of pounds. The BBC captured as much with a headline in April 2021: ‘TikTok sued for billions over use of children’s data.’
Should TikTok lose the case and the judges ultimately find in Longfield’s favour, this could threaten its highly successful business model, as well as cause significant reputational damage to its brand. If the courts were to find that the app had breached data protection laws, that $400bn IPO valuation might start to look overly optimistic.
Tick tock - time’s up for other law-breakers
But you would be right to assume that there are many ifs in this outlook, predicated as it is on what would be a long route to success through the courts, potentially lasting for several years and likely to be fought hard by TikTok. At this juncture, the claim’s prospects of success are, to say the least, uncertain. It is still at a very early stage and much will depend on the outcome of another case, Lloyd v Google, which was recently heard in the Supreme Court.
That case centres on Google’s unlawful collection in 2011/12 of data on health, race, ethnicity, sexuality and finance through Apple’s Safari web browser, even when users had chosen a ‘do not track’ privacy setting. It too has a very large number of potential victims: Mr Lloyd, who brought the case, aims to obtain compensation on behalf of a class of 4.4 million affected users.
Representative actions are relatively uncommon in the UK. A key issue in Lloyd v Google is whether such cases can be brought on behalf of a wide class of individuals in respect of data protection breaches. The Supreme Court has to decide whether Mr Lloyd’s claim can proceed on behalf of the entire class of individuals affected, without requiring them to specifically opt-in. If the Court rules against him, then this is likely to significantly harm the chances of the case against TikTok proceeding. Funded by a third-party litigation funder, the TikTok claim may not be viable unless it can encompass the entire class.
A rise in litigation
In case you did not know, it appears that we are now living in an age of data class actions, fuelled by the inexorable growth of litigation funding. Arguably, these actions should deliver accountability and ultimately help to achieve the public policy objective of creating trust amongst users. In turn, this will enable the digital economy to develop for your benefit and that of everyone else - or so the argument goes. But there are reasons to be sceptical, too.
If people were really concerned enough about how digital services and social media companies such as Google and TikTok used their data, these claims could go ahead on the basis of claimants opting in. That these cases are only viable to their funders on behalf of a whole class, the majority of whom seem at best oblivious to the issues, is a real cause for concern.
If the cases against Google and TikTok do prove successful, such cases will become easier to bring and their number will inevitably increase. Either way, the current tech giants and the new digital operators of the future will undoubtedly continue to test the limits of the law to see just how far they can go in fulfilling their objectives through the use (or misuse) of data, which they gather for commercial gain.
Tightening up codes of practice
There are other issues of concern for TikTok, not least a new code of practice for digital services accessible by children, which came into force last September. The code, produced by the Information Commissioner’s Office under the Data Protection Act 2018, contains mandatory standards of age-appropriate design. It applies to online services that are likely to be accessed by children, including apps, social media platforms and websites.
Although the new code does not introduce any new legal requirements, businesses that provide online services will need to meet the standards contained within the code in order to comply with data protection laws. And the Online Safety Bill, which is currently before Parliament, looks set to pile additional responsibilities on to social media companies.
In this maelstrom of regulation and litigation, you can be certain of one thing: apps popular with children, such as TikTok, will remain very much in the spotlight.
About the author
Jon Belcher is a specialist data protection and information governance lawyer at Excello Law.