Use of messaging app Telegram surged in the UK on the day that a peaceful vigil for three girls killed in a mass stabbing in Southport turned into a night of rioting tied to the far-right English Defence League.
Telegram, known for its “hands off” approach to content moderation, is facing renewed pressure to tackle extremist groups on its platform in the UK, after it emerged as one of the main tools used to mobilise rioters and stoke unrest.
Active users on the app rose to 3.1mn on July 29, the day of the stabbing in the seaside town in north-west England, up from an average of about 2.7mn since the beginning of 2024, according to figures from Similarweb, an online analytics company.
That figure jumped to 3.7mn the following day, when a night of violence in Southport, which centred on attacks against a local mosque, left at least 50 police officers injured. Merseyside police said they believed the far-right English Defence League, founded by activist Tommy Robinson, was behind some of the violence.
Telegram usage returned to average levels by the weekend, according to Similarweb data.
The Southport riot ignited a wave of violence across the country, which UK ministers, police and analysts say was both fuelled by and organised through online platforms including Telegram, TikTok and Elon Musk’s X.
UN-backed counterterrorism group Tech Against Terrorism said on Wednesday it was issuing an “urgent alert” over the organisation of UK riots by far-right extremists using Telegram. It cited the growth of a 15,000-strong Telegram group, since removed, that had shared a list of protest targets, including immigration-related sites.
“Telegram’s inadequate moderation of extremist channels is contributing to violence and unrest across the UK,” Tech Against Terrorism said.
As several UK towns were braced for further violence on Wednesday, media watchdog Ofcom urged tech platforms to be “proactive” in taking down material that stirred racial hatred or promoted violence.
“We welcome the proactive approaches that have been deployed by some services in relation to these acts of violence across the UK,” Ofcom said. “In a few months, new safety duties under the Online Safety Act will be in place, but you can act now — there is no need to wait to make your sites and apps safer for users.”
Telegram said its moderators “are actively monitoring the situation and are removing channels and posts containing calls to violence”.
“Moderators use a combination of proactive monitoring of public parts of the platform, sophisticated AI tools and user reports to ensure content that breaches Telegram’s terms is removed,” it said.
Founded in 2013 by Russian billionaire Pavel Durov and his brother Nikolai, Telegram has gained prominence for its positioning as an anti-surveillance, “free speech” messaging platform. As a result it has been used by everyone from pro-democracy protesters in Iran to far-right organisers in the UK, and, according to some researchers, has attracted scrutiny over criminal activity on the platform.
Tell Mama, a group that documents anti-Muslim incidents, said on Monday it had identified far-right posts on Telegram threatening to target immigration solicitors and refugee services in more than 30 UK locations.
Dubai-based Telegram allows users to send encrypted messages privately, or create groups of up to 200,000 members, and “channels” — for one-way broadcasting of messages — with unlimited subscribers. Its guidelines on content moderation state that it does not allow spam and scams, illegal pornography or the promotion of violence on “publicly viewable Telegram channels”.
It also bans terrorist channels, after bowing in 2019 to pressure to take down Isis-linked groups. In the wake of the January 6 2021 attack on the US Capitol building, Telegram closed down public extremist and white supremacist groups involved.
This week, several prominent public Telegram groups organising far-right violence in the UK, including one called “Southport Wake Up”, appeared to have been removed. However, researchers warned that new back-up channels and private groups, which are harder to monitor, continued to share misinformation and racial hatred.
The UK government’s disinformation unit — the National Security Online Information Team — has been compiling examples of social media posts that it believes are spreading disinformation and inciting violence, and alerting social media companies to concerning content.
While several companies had been quick to respond by removing flagged posts, X was identified as less responsive and had left concerning content online, according to people briefed on the government unit’s activities.
Some researchers warned that public platforms such as X, which attract a broader user base, were being used to recruit people to more secretive far-right organising networks on platforms such as Telegram, including private groups that are difficult to monitor.