The regulator is examining whether the platform broke the law by supplying inaccurate information
The UK communications regulator is investigating TikTok for potentially providing inaccurate information about its parental controls. Ofcom sought details of TikTok’s Family Pairing system and, concerned about the accuracy of the information supplied, has opened an investigation to determine whether the Chinese-owned platform breached the Communications Act 2003.
TikTok attributed the issue to a technical glitch, acknowledging that it may have caused inaccuracies in the information submitted to Ofcom. The platform said it identified and reported the problem to Ofcom several weeks ago.
The regulator had requested the information for a report on how video-sharing platforms (VSPs) were safeguarding users against harmful content.
Ofcom said: “The available evidence suggests that the information provided by TikTok in response to the notice may not have been complete and accurate.” The regulator said it would provide an update on the investigation in February.
The announcement coincided with the release of a report on Thursday assessing how three prominent VSPs, TikTok, Snap and the video game streaming service Twitch, were protecting children from encountering harmful videos.
The report highlighted research indicating that more than a fifth of children aged eight to 17 have an online profile registered with an age of 18 or older. A third of children aged eight to 15 have an account with a user age of 16 or older, according to Ofcom.
The regulator questioned whether relying on a user’s self-declared age at sign-up is adequate, and called on platforms to do more to establish users’ ages.
Ofcom said: “We therefore expect platforms to explore additional methods to gain a better understanding of the age of their users to be able to tailor their experiences in ways that are more appropriate to their age and that protect them from harm.”
The regulator said TikTok used undisclosed technologies to detect keywords that signal a potentially underage account, while Twitch used various measures, including language analysis tools. Snap, by contrast, relied on user reports to identify underage users.
According to information provided to Ofcom, TikTok said accounts removed for being underage in the 12 months to March 2023 amounted to just over 1% of its monthly active user base. Over the same period, Twitch reported removing 0.03% of its total UK user base, and Snap removed up to 1,000 accounts.