Ofcom investigating TikTok's parental controls
TikTok blames technical glitch
The UK media regulator, Ofcom, has launched an investigation into social video platform TikTok over concerns that the app provided "inaccurate" information about its parental controls.
Ofcom said it had "reasonable grounds for believing" that TikTok, owned by the Chinese company ByteDance, may have breached its legal responsibilities.
The regulator believes TikTok may have misled it about its Family Pairing system, potentially in breach of the Communications Act 2003.
TikTok's Family Pairing feature enables parents to link their accounts to their children's accounts, granting control over settings such as screen time limits.
Ofcom said that, based on initial findings, information it had requested from TikTok about the feature may be inaccurate.
TikTok responded by attributing the issue to a technical glitch several weeks earlier, which led it to provide inaccurate data. The company said it had notified Ofcom and was working to rectify the problem.
The company now says it plans to supply accurate data as soon as possible to address the regulator's concerns.
More power
Ofcom's investigation falls under its remit to protect children from online harm, following the enactment of the landmark Online Safety Act, which positioned Ofcom as the UK's online safety regulator.
The new legislation is considered among the world's most robust online regulations, and Ofcom has been actively holding companies accountable for legal breaches.
The regulator has now gained powers over video-sharing platforms, including TikTok, Twitch, Snapchat and OnlyFans. It can fine companies failing to comply with regulations up to 5% of qualifying revenue, or £250,000.
On Thursday, Ofcom published a report on the protection of users. It assessed the protection measures implemented by TikTok, Snapchat and Twitch, two years after it issued guidance on safeguarding younger users from harmful content.
While all three platforms met the requirements set out, Ofcom is still concerned about potential harm that young users may face when using these platforms.
The regulator is especially concerned about children providing false ages. More than a fifth of children aged eight to 17 reportedly have online profiles misrepresenting their age as 18 or older.
Ofcom questioned the efficacy of self-declaration policies and urged platforms to employ additional measures to verify users' ages.
TikTok says it uses unnamed technologies, including keyword detection, to identify potentially underage accounts. Twitch uses language analysis tools, while Snap relies on user reports.
According to TikTok, accounts removed for being underage amounted to more than 1% of its monthly active user base in the 12 months to March 2023.
Over the same period, Twitch reported removing 0.03% of accounts, while Snap removed up to 1,000 accounts (a fairly pathetic showing from Snapchat, which has hundreds of millions of daily users worldwide).
Ofcom will provide further updates on its investigation into TikTok in February.