Social media fuelled Southport misinformation, UK home secretary says


The UK government has pushed social media companies to take responsibility for amplifying “misinformation [and] encouragement of violence” after riots broke out in towns and cities across England.

Yvette Cooper, home secretary, said on Monday that social media platforms had “put rocket boosters” under such content and added that the government would pursue the issue with tech companies.

Riots erupted a week ago on the streets of England in the wake of the murder of three young girls in Southport, a town north of Liverpool. The far-right violence and disruption, initially sparked in response to the mass stabbing at a dance class, has spread to cities and towns across the country and led to more than 420 arrests. 

“Social media companies also need to take some responsibility for this,” Cooper told BBC Radio 5 Live on Monday, while noting that the police would also be pursuing “online criminality”.

“Social media has put rocket boosters under some of the, not just misinformation, but the encouragement of violence,” she said. “That is a total disgrace and we cannot carry on like this. So we will also be pursuing this with social media companies.”

Soon after the murders in Southport on July 29, posts containing false information about the attacker’s name and identity spread rapidly across social media platforms, including claims that he was a Muslim and that he had recently arrived in the UK as a migrant crossing the Channel in a small boat.

Axel Rudakubana, 17, was named on Thursday at Liverpool Crown Court, charged with three counts of murder and 10 counts of attempted murder. He is not Muslim or a migrant and was born in Cardiff to parents who had emigrated from Rwanda.

Earlier on Monday, Cooper said a “longer-term debate about the wider legal framework” for tackling online misinformation was now required.

The Online Safety Act, which became law in 2023 after years of debate but is still being brought into force, was designed to protect users of online services, including social media platforms. It created sweeping powers for UK media regulator Ofcom to police the tech giants for flouting the rules, including imposing hefty fines and criminal liability for named senior executives in the most serious breaches.

However, the Act only covers misinformation if the content is deliberately false and distributed with the intent to cause “non-trivial psychological or physical harm to a likely audience”.

Other clauses in the wide-ranging legislation, though, may cover some social media content relating to the riots. Under the Act, it is an offence to encourage, promote or provide instructions for violence, or to incite hatred against people of a particular race or religion.

Meta, TikTok and X have been approached for comment.

Marc Owen Jones, an associate professor at Qatar’s Hamad bin Khalifa University, tracked 27mn impressions in the day after the attack for posts on X that stated or speculated that the attacker was a Muslim, migrant, refugee or foreigner.

Researchers said recommendation algorithms helped to amplify misinformation. The Institute for Strategic Dialogue found that users searching the word “Southport” on TikTok were still being shown a false name as a suggested search query hours after Merseyside Police said it was incorrect. On X, meanwhile, the false name was displayed in users’ sidebars as a topic “Trending in United Kingdom”.

The far-right activist and co-founder of the English Defence League Stephen Yaxley-Lennon, known as Tommy Robinson, has posted continual commentary and videos of the rioting on X. He has repeatedly claimed that the violence in towns including Middlesbrough and Bolton was caused by “mobs of Muslims” rather than far-right rioters. 

Robinson, who was banned from Twitter in 2018 for breaching its policies concerning hateful conduct, was allowed back on to the platform in November after Elon Musk took over the company and rebranded it X. 

In November 2022, a month after the sale of Twitter was completed, Musk said he would grant a “general amnesty” to accounts that had previously been suspended but had “not broken the law or engaged in egregious spam”. He also fired content moderation staff tasked with policing content on the platform.

Sir Keir Starmer on Monday hit out at a claim on X by Musk that “civil war is inevitable” in the UK.

“There’s no justification for comments like that,” the prime minister’s official spokesperson said. “What we’ve seen in this country is organised, violent thuggery that has no place either on our streets or online.”

Olivia Brown, associate professor in organisational behaviour at the University of Bath, said reinstating individuals such as Robinson with “hateful rhetoric”, coupled with a reduction in content moderation on X, “has led to an unprecedented spread of misinformation and hateful rhetoric”.

She added that the platform’s architecture and low barrier to entry “facilitate the spread of information”, making it “impossible to tell if it’s a genuine account or a bot, or indeed a state actor”.

“We know that the very act of interacting with others online can mobilise individuals to act offline,” she added. “What we are witnessing unfold online at the moment is not only concerning for the polarisation of society and community cohesion as a whole but for attempting to understand [and] prevent the violence we are seeing on the streets.”
