UK regulator probes TikTok over parental controls


TikTok is being investigated by the UK media regulator over concerns the Chinese-owned video app supplied “inaccurate” information about its parental controls as the watchdog intensifies its efforts to protect children from harmful online material.

Ofcom said on Thursday it had “reasonable grounds for believing that” ByteDance-owned TikTok had breached its legal responsibilities, adding that it might take enforcement action.

Responding after news of the probe broke, TikTok blamed a technical issue for the inaccurate data. The company said it had notified the regulator, was working swiftly to rectify the problem and planned to supply accurate data as soon as possible.

Ofcom had requested information from TikTok to understand and monitor how the viral video platform’s parental controls worked. The regulator on Thursday said the “available evidence suggests that the information provided . . . may not have been complete and accurate”.

Ofcom is intensifying its work to protect children from harm as part of its role as the UK’s online safety regulator, following a landmark piece of legislation passed in October.

The UK’s legislation is seen as among the strongest online regulations in the world and Ofcom has pushed to hold companies to account for breaches of the law. This month, it issued guidance to porn websites, forcing them to introduce stricter technical measures to ensure that their users are over the age of 18.

The regulator has also recently gained powers to regulate video-sharing platforms and has been gathering information since 2021 on the leading companies, including TikTok, Snapchat, Twitch and OnlyFans. It can impose fines of up to 5 per cent of qualifying revenue or £250,000, whichever is greater, on companies that fail to comply with its regulations.

Ofcom highlighted in its report on Thursday that TikTok, Twitch and Snapchat rely on users declaring their age when signing up, meaning it is easy to gain access by entering a false age. It chose to focus on these three platforms because of their popularity with under-18s.

The three platforms employ other methods to identify underage users, including human moderators and artificial intelligence that detects language suggesting a user is under 18, such as an age written in a profile bio.

Ofcom, however, said the effectiveness of such measures was “yet to be established”.

“We asked TikTok for information about its parental control system, Family Pairing, and we have reason to believe that the information it provided was inaccurate,” Ofcom added.

TikTok’s Family Pairing feature, introduced in 2020, allows parents to link their accounts with their teenagers’ accounts in order to set limits on screen time, the types of content they can see and who can send them messages on the platform.

Research by the regulator has found that more than a fifth of children aged between eight and 17 have an adult online profile, despite many platforms such as TikTok allowing users to have accounts from the age of 13.

TikTok told the regulator that the number of underage accounts removed between June 2022 and March 2023 was about 1 per cent of its UK monthly active user base.
