A damning new report from the Centre for Countering Digital Hate (CCDH) reveals that Elon Musk's X platform remains fertile ground for violent anti-Muslim and anti-migrant hate, with the company actively profiting from content that fuelled widespread disorder and riots across the UK last year. One year on from the devastating Southport stabbings and the civil unrest that followed, the CCDH warns that "little has changed," with X failing to moderate an "explosion of dangerous, violent content" despite clear connections between online posts and real-world violence.
The CCDH's analysis, which has previously drawn the ire of X's billionaire owner, found that high-profile "hateful influencers" on the platform amassed millions of views daily in the aftermath of the Southport tragedy in July 2024. These figures, including Tommy Robinson, Paul Golding, Ashlea Simon, Andrew Tate, Laurence Fox, and Calvin Robinson, falsely linked the attack to Muslims and migrants, directly contributing to widespread unrest. Ofcom, the UK's online safety regulator, subsequently concluded there was a "clear connection" between social media posts and the eruption of the 2024 riots.
Worryingly, the CCDH's latest research indicates that the "same forms of violent and murderous rhetoric that precipitated and inflamed the 2024 riots" are still circulating widely on X with scant moderation. The report highlights how "parent" posts from these six far-right or extremist influencers frequently trigger "intensely violent" replies from users, encouraging extreme acts such as calls to "shoot, to maim, and to kill." Shockingly, X not only amplifies this content through its algorithms but also monetises it. All six identified accounts are 'verified' Blue Tick users, meaning their posts are actively promoted and they are highly likely to generate income through X's Creator Revenue Sharing programme. Three of these accounts (Fox, Robinson, and Tate) further profit through paid subscriptions to their content.
The CCDH used an AI model to identify 4,379 violent replies targeting Muslims or migrants in response to just 322 posts from these six accounts over the past year. These replies garnered at least 383,102 views and included chilling calls for violence, such as "We need to shoot them at the point of entry, as they unload from the dinghies," and "I would like to go postal on the Muslims! They are evil as hell!" A significant number of posts continue to call for Muslims to be "exterminated" and "executed." Tommy Robinson's posts generated the most violent replies, followed by Paul Golding and Ashlea Simon.
The report also underscores Elon Musk's personal role in exacerbating the issue. During the 2024 riots, X emerged as a "crucial vector of false information and hate," with Musk himself having "personally amplified conspiracy theories," warning of an impending "civil war" in Britain to his hundreds of millions of followers. The CCDH notes that "hate preachers" previously banned from Twitter were reinstated by Musk and continue to receive millions of views daily. This comes as recent reports from May 2025 further detail how Elon Musk and far-right networks leveraged X to amplify Islamophobia in the UK, particularly around "grooming gangs" narratives, with engagement metrics orders of magnitude higher in early 2025 than in 2024.
The findings are expected to intensify pressure on the Labour Government to take stronger action against X and to reconsider its own extensive use of the platform. While the UK's Online Safety Act 2023 is now in force, with Ofcom implementing duties requiring tech companies to identify, mitigate, and manage risks of illegal and harmful content, critics argue that these measures deal with abuse after it has happened rather than tackle the business model that actively amplifies and encourages it. As of July 2025, a government petition to review penalties for social media posts, including imprisonment, has garnered significant support, reflecting growing public concern over online harms. Despite these developments, the Prime Minister's office has previously dismissed calls to abandon X, viewing issues such as its AI chatbot endorsing Hitler as "a matter for the company."
The CCDH's stark warning one year after the Southport riots serves as a powerful reminder of the tangible dangers of unchecked online hate and the urgent need for platforms like X to prioritise safety over profit.