Ofcom poaches Big Tech staff in push to enforce new internet curbs


Ofcom has been poaching staff from Big Tech companies as the UK media watchdog prepares to enforce one of the world’s toughest new regulatory regimes for the internet.

The regulator has created a new team of nearly 350 people dedicated to tackling online safety, including new hires from senior jobs at Meta, Microsoft and Google. Ofcom also aims to hire another 100 this year, it said.

The staff increases are a response to the Online Safety Act, which became law in the UK in October. It gives the media watchdog sweeping new powers to oversee some of the biggest companies in the world as well as hundreds of thousands of smaller websites and apps.

The law gives Ofcom powers to hold platforms responsible for illegal material, such as child sexual abuse and terror content, and imposes broader duties on them to protect children online.

“The expectations are very high, but it’s as quick as I’ve ever seen a regulator act,” said Melanie Dawes, chief executive of Ofcom. “Nothing is ever fast enough.”

The OSA became the centre of an emotive debate during its journey through parliament, as bereaved parents of teenagers who died by suicide after encountering harmful material urged politicians to move faster and hold companies accountable.

“There is a reputational risk for Ofcom since there is a clear danger that public expectations will be raised beyond the point that they can reasonably be satisfied,” said a former non-executive Ofcom board member. “Given the scale of the task, that must be a distinct possibility.”

Ofcom has published lengthy documents and codes of practice in the hope that providing detailed guidance will encourage compliance before the regulator’s formal enforcement powers kick in over the course of the next two years.

The regulator will then be able to fine companies that flout the rules up to 10 per cent of annual global turnover and pursue criminal cases against named senior managers who fail in their duty of care.

Ofcom has estimated that implementing the act will cost £166mn by April 2025, of which £56mn will have been spent by April this year. The regulator plans to recover those costs through a fee structure levied on the companies it oversees.

Ofcom also expects that many of its decisions may need to be defended in the courts, with tech companies keen to challenge unclear aspects of the act to clarify the law. That will test how effective the watchdog is when up against the legal teams of some of the world’s most deep-pocketed tech companies.

“We are fully prepared to take risky cases in terms of our own legal exposure,” said Suzanne Cater, director of enforcement at Ofcom. “We will be up against some big companies; there could be a very hostile environment here.”

The new powers, however, have attracted executives from leading companies. Almudena Lara, who joined Ofcom in June from Google, said she moved to continue her work in child safety. “Most in the tech sector have the right motivations, but the reality is that it’s very hard sometimes to prioritise user safety,” she said.

Jessica Zucker joined as Ofcom’s director of online safety policy in June 2022, having previously led Meta’s misinformation policy team in Europe and after working at Microsoft.

She said there was “overwhelming” interest in roles at Ofcom, helped by job cuts in the tech sector over the past two years, in which companies shed tens of thousands of workers, including after Elon Musk’s takeover of X, formerly Twitter, which slashed the company’s trust and safety teams.

“Those still motivated by online safety and proportionality see Ofcom as the alternative,” said Zucker. “You could do it for one company, or you can do it for an entire industry.”

Zucker has led the regulator’s work to tackle video-sharing platforms, such as TikTok, Snap and Twitch, with new powers to compel groups to hand over details of their operations, including how they protect children, or face fines.

The regulator in December opened an investigation into TikTok after the viral app supplied it with inaccurate data on its parental controls. In response, TikTok said it had made a technical error and was working to rectify the issue.

“If companies don’t comply, we will be taking a pretty hard line because the whole regime rests on our ability to be able to gather accurate information when we need it,” said Cater. “There’s no wiggle room here.”

Ofcom also plans to engage with the largest or riskiest online services through regular supervision meetings to discuss compliance, led by Anna Lucas, director of supervision. As a result of such discussions, OnlyFans introduced age verification technology for its users, and Twitch removed sexual content from its homepage.

Critics have warned that the sweeping nature of the act will leave Ofcom stretched, despite the recent hires dedicated to enforcement.

“The central problem is scale. Nobody knows how that will play out, but Ofcom’s view is you have to start somewhere,” said a former senior Ofcom employee. “You can’t boil the ocean, and so there will be a filtering of these services by perceived risk.”

“There are no silver bullets to online safety,” said Gill Whitehead, who leads Ofcom’s online safety efforts, having previously worked at Google. She described the content she had encountered in her role as “deeply disturbing”.

But, she added: “There is a real purpose . . . because changing even one child’s life [makes] a massive difference.”

