
X to Block UK Access to Terror-Linked Accounts in Ofcom Deal

Elon Musk’s X platform agrees with Ofcom to block UK access to terrorist-linked accounts and review illegal content within 48 hours, amid ongoing concerns over hate and extremist material online.

A silhouetted hand holds a smartphone showing the X logo

Media Regulator Announces X’s Commitments to Combat Terrorist and Hate Content

Elon Musk’s social media platform X has committed to blocking UK access to accounts associated with banned terrorist groups, following an agreement with the UK communications regulator Ofcom aimed at intensifying efforts to tackle terrorist and hate content online.

As part of this agreement, X will review suspected illegal terrorist and hate content within 48 hours and will consult experts on how to manage user reports related to such content.

The UK’s media regulator, Ofcom, announced these commitments as part of a broader initiative to ensure social media platforms implement effective systems to address terrorist and hate material, amid ongoing concerns that harmful content remains insufficiently addressed on major platforms.

Oliver Griffiths, Ofcom’s online safety group director, said: “Following intensive engagement carried out by Ofcom’s online safety team, X have committed to implementing stronger protections for UK users, which we will now monitor closely.”

Griffiths emphasized that the issue of online terrorist and hate content has become increasingly urgent following a series of hate crimes targeting the UK’s Jewish community.


Under the terms of the agreement, X will block UK access to accounts that post illegal terrorist content and are linked to terrorist organizations proscribed by the UK government. Additionally, the platform will review at least 85% of illegal terrorist and hate content flagged through its illegal-content reporting tool within 48 hours. This aligns with the UK’s Online Safety Act, which aims to protect UK users from illegal content, including terror and hate-related material.

Ofcom also stated it is continuing its investigation into X over images manipulated with the Grok AI tool, another Musk-owned technology, which reportedly depict women and girls as partially unclothed.

Danny Stone, chief executive of the Antisemitism Policy Trust, described the agreement as a “good start” but noted that X continues to “fail in so many regards” in addressing racism on its platform.

Adam Hadley, executive director of Tech Against Terrorism, an organization focused on combating online extremism, described the announcement as a “powerful example of what constructive dialogue between regulators and platforms can deliver.”

X has faced ongoing criticism over its content moderation practices since Elon Musk acquired the platform for $44 billion (£33 billion) in 2022, when it was known as Twitter. Last year, Amnesty International accused X of enabling a “staggering amplification of hate” during the riots following the Southport murders in 2024.

This article was sourced from The Guardian.
