Tech giant questioned over claims that underage users can stay on the platform by saying a parent oversees their account
TikTok is facing scrutiny over its protections for child users after a Guardian investigation found that moderators were allowing under-13s to stay on the platform if they claimed parental supervision. In one documented case this autumn, a 12-year-old, below TikTok’s minimum age, was allowed to remain because a parent was said to manage the account, prompting a moderator to seek guidance from a quality analyst.
The quality analyst’s guidance was that if the account’s bio mentioned parental management, moderators could let the account remain on the platform. The directive was shared in a group chat with more than 70 moderators who oversee content primarily from Europe, the Middle East and Africa.
Moderators have also allegedly been told in meetings that accounts can be retained if a parent appears in the background of an apparently underage user’s video, or if the bio indicates parental management.
Accounts suspected of belonging to underage users are sent to an “underage” queue for further review. Moderators then face two choices: banning, which removes the account, or approving, which allows it to remain on the platform.
According to a TikTok staff member, it is “incredibly easy” for underage users to evade a ban. The staff member expressed concern that once children discover this tactic, they are likely to share it with their peers.
TikTok disputed the assertion that children under 13 are permitted on the platform simply by stating in their bio that the account is supervised by an adult. A spokesperson said: “These claims regarding TikTok’s policies are inaccurate or stem from misunderstandings. The Guardian has not provided sufficient details about their other allegations for us to investigate. Our community guidelines are universally applicable to all TikTok content, and we strictly prohibit users under the age of 13 on our platform.”
On its website, TikTok says it is strongly committed to ensuring a safe and positive experience for users under the age of 18. It emphasizes that users must be 13 or older to have an account and highlights a compulsory age gate for all users during sign-up. TikTok reports that between April and June of this year alone, it removed more than 18 million suspected underage accounts globally.
However, TikTok has faced regulatory challenges regarding its handling of accounts for users under 18. In September, the Irish data watchdog imposed a €345 million (£296 million) fine on TikTok for violating EU data law in its management of children’s accounts, including the failure to shield underage users’ content from public view.
In a separate case in April, the UK data regulator fined TikTok £12.7 million for the misuse of data concerning children under the age of 13. The Information Commissioner’s Office accused TikTok of doing too little to prevent under-13s using the app and of using their data without parental consent.
TikTok’s community guidelines make no reference to a waiver based on parental supervision.
The Guardian is conducting an investigation into TikTok, fueled by ongoing concerns about the platform’s moderation of its more than 1 billion users worldwide. Internal communications reviewed by The Guardian are likely to raise new questions about how the app is policed.
Evidence observed by The Guardian suggests that certain potentially underage accounts have been assigned internal tags, granting them preferential treatment.
In one instance, the account of a child who appeared to be underage carried a “top creator” tag. The child posts videos showing their preparations for school and after-school routines.
The child’s profile also stated that their parents oversaw the account.
The user included “dm for collabs” in their bio and used the hashtag “tiktokdontbanme” in one of their videos. According to TikTok community guidelines, users must be at least 16 years old to use direct messages.
There was no conclusive evidence of parental supervision for the account, and it had fewer than 2,000 followers.
In another case, a child who appeared to be under 13 had a “top creator” designation next to their name and had accumulated more than 16,000 followers on the platform.
The “top creator” tag is assigned to accounts that some moderators have been instructed to handle with more leniency.
According to TikTok’s community guidelines, users must be 13 years or older to create an account. In the United States, there is a distinct TikTok experience for those under 13, featuring additional safety measures and a dedicated privacy policy. TikTok emphasizes in its guidelines that if someone is found to be below the minimum age, their account will be banned.
In the UK, TikTok’s adherence to age limits is governed by the children’s code, which aims to safeguard children’s online data. The code specifies that processing the personal data of a child is legally permissible if the child is at least 13 years old. For those below this age, parental consent is required for the processing of a child’s data.
The code also mandates that services under its jurisdiction adopt a risk-based approach to verify the age of individual users and effectively apply the standards outlined in the code to child users.
The creator of the code, Beeban Kidron, a crossbench peer, expressed to The Guardian her dismay upon learning that it seemed “possible for an underage child to remain on a service once the service is alerted to the fact that the user is 12.”
She further stated, “I believe the sector’s design choices and moderation choices are profit-driven. Those choices can put children at risk of harm.”
In the UK, TikTok is also subject to regulation by Ofcom, the communications watchdog, under its video-sharing platform rules. These rules are being incorporated into the Online Safety Act.
Under the newly introduced act, tech platforms are mandated to outline in their terms of service, which all users agree to, the measures implemented to prevent underage access. Furthermore, these platforms must consistently enforce these terms.
Lorna Woods, a professor of internet law at the University of Essex, emphasized, “Under the Online Safety Act, platforms bear the responsibility to enforce their terms of service consistently. Irrespective of the provisions in the act regarding age verification, if the platform has a rule stating that individuals under the age of 13 are not allowed, it should be applied uniformly.”
Children within the EU benefit from protection under the Digital Services Act, which mandates major platforms like TikTok to implement measures such as parental controls or age verification to shield children from harmful content. The act also implies that platforms should possess a high level of certainty regarding a user’s age, as it prohibits tech firms from utilizing under-18s’ data for targeted advertising.
TikTok claims to have over 6,000 moderators in Europe who enforce the platform’s community guidelines “equally to all content”.