The UK government is considering reviving controversial powers to force internet companies to remove “legal but harmful” content, as the first prison sentence was handed to someone who helped fuel recent far-right riots by stoking tensions online.
Officials said there had been conversations in recent days about reviving the proposal, which was abandoned in 2022 following a backlash from the tech industry and free speech advocates, but they stressed that no decisions had been taken.
On Friday, Sir Keir Starmer said the government will “have to look more broadly at social media after this disorder”, an indication that ministers are minded to strengthen the UK’s incoming online laws.
Dozens of far-right riots have broken out across the UK since a mass stabbing last week in Southport, with unrest fuelled partly by misinformation that spread on social media sites such as X and Facebook.
The government has promised that people who whip up violence online will face prosecution, just as those who carry out violence in the streets will.
Ministers’ primary focus is on gripping the immediate crisis and preventing riots from erupting this weekend. On Friday Starmer visited Scotland Yard and warned that police must remain on “high alert” for unrest reigniting.
Ministers’ consideration of fresh moves to strengthen regulation comes after X owner Elon Musk exacerbated tensions on his platform this week, claiming that “civil war is inevitable” in the UK. The provocative remark prompted a slapdown from Number 10, which said there was “no justification for comments like that”.
The billionaire also taunted Starmer with the slogan “twotierKeir”, a reference to the widespread claim among the hard right that police treat right-wing protesters more harshly than others.
In the wake of the violence, ministers are now looking at bringing in a measure to clamp down on harmful social media content.
The Online Safety Act was passed last year to govern social media platforms, although it will not come into full effect for several months yet.
It will create sweeping powers for UK media regulator Ofcom to sanction technology giants that fail to police illegal content — such as hate speech and incitement to violence — including by imposing hefty fines and, in the most serious breaches, criminal liability for named senior executives.
The present version of the legislation, however, only covers misinformation if the content is deliberately false and distributed with the intent to cause “non-trivial psychological or physical harm to a likely audience”.
Sadiq Khan, the Labour mayor of London who has received torrents of racist and anti-Muslim abuse online, has warned that the act is “not fit for purpose”.
Khan called on ministers to review the legislation and told the Guardian: “I think very swiftly the government has realised there needs to be amendments to the Online Safety Act.”
Resuscitating the provisions against “legal but harmful” content, first reported by Bloomberg, could enable Ofcom to force social media platforms to crack down on the kind of misinformation that helped incite the recent rioting — including false claims that the Southport attacker was a recent migrant to the UK and that he was a Muslim.
A previous iteration of the measure was scrapped in November 2022 following an intense lobbying campaign from technology leaders and privacy advocates.
Critics at the time argued that the provision would not just create new liabilities for Silicon Valley giants such as Meta and Google, but also for smaller businesses that host user-generated content online, such as travel-review sites and start-ups. They also warned that it could clash with EU data protection rules and deter multinational technology companies from investing in the UK.
Toby Young, director of the Free Speech Union, said his organisation had opposed the previous government’s efforts to proscribe “legal but harmful” content online “on the grounds that it was a departure from one of the sacrosanct principles of English common law, which is that unless something is explicitly prohibited by law then it should be permitted”. He urged the new Labour administration to shelve the idea.
On Friday, Jordan Parlour, 28, was jailed for 20 months after he posted messages on Facebook about attacking a hotel where asylum seekers were based.
While hundreds have been arrested over the past week’s far-right violence, Parlour’s sentencing at Leeds Crown Court is the first time anyone has been jailed for online activity relating to the disorder.
“Online actions have real consequences,” said Rosemary Ainslie of the Crown Prosecution Service. “People who think they can hide behind their keyboards and stir up racial hatred should think again.”
More than 480 people have been arrested, and over 190 charges brought, in connection with the unrest triggered by the Southport mass stabbing.
Starmer said the strong police presence on English streets and the “swift justice” dispensed in courts across the land had played a role in the disorder easing since Wednesday, when unrest anticipated in 100 locations largely failed to materialise.
Earlier this week home secretary Yvette Cooper served notice that she would examine the legal framework governing large social media companies, as she complained that some firms had been far too slow to take down criminal content during the unrest. The Financial Times reported this week that officials were frustrated that X had been slower than rivals to remove posts.
Cooper also raised concerns that major social media firms were failing to enforce their own rules, which ban hate speech, on their platforms.
Dame Diana Johnson, another Home Office minister, reminded social media giants that they have an “obligation” to deal with criminal offences being committed on their platforms — which does not require the Online Safety Act to come into force.