REGULATION
All of this has unfolded just as the UK’s landmark Online Safety Act (OSA) has come into effect – a sweeping piece of legislation, passed through parliament in 2023, aimed at making the internet safer.
Since March, the law has been enforced by the industry regulator Ofcom, which now holds power to compel platforms to act against illegal material, such as terrorist content, child sexual abuse material or hate speech.
Some users on X say age verification measures, which Ofcom has required platforms like X to have in place since July, are being used to suppress political speech.
“The argument made in favour of the Online Safety Act is to protect children from seeing material online which they shouldn’t see. Obviously I support that,” says Ben Habib. “However, the Online Safety Act no doubt is being used as a mechanism by which to patrol for extreme views.”
In response to our reporting, Ofcom told Sky News that the OSA does not require platforms to ensure political balance in what is posted on their platforms, and that it is not Ofcom’s job to “tell platforms which specific posts or accounts to take down”.
Their statement added that "in due course, some of the most widely-used sites and apps will be subject to additional duties under the Act, including protecting content of democratic importance".
MISSION IMPOSSIBLE
The platform’s influence on Britain’s streets is no longer subtle – it’s stark and unmistakable. But finding a fix that satisfies all sides of the debate remains a near-impossible task.
The public is wary of any government attempt to police free speech, online or in the real world, in ways that could threaten democracy. On the other hand, giving social media companies unchecked control over what appears in users' timelines creates its own tensions.
Experts who monitor extremism online say that, in an ideal world, social media platforms would use politically neutral algorithms to surface content based purely on user interests. Our analysis shows that’s not what is happening.
Instead, owners of large platforms like X can influence the algorithm's code, or directly decide what users see, whether to serve their own priorities or to drive up engagement by promoting content that keeps people hooked on the platform.
As a result, platforms built as public town squares for self-expression and connecting with others are now more like private property, where the loudest voices belong to those with power, or to those whose views align with the people who control the algorithm.
Countless reports and academic studies have explored political bias on X, but Sky News’ investigation offers unprecedented evidence of how the effects of that imbalance are playing out in the UK.
“These findings confirm what a lot of us feared and suspected from our own feeds, but couldn’t prove,” said Sir Ed Davey. “It’s a major contribution to the debate that I think we have to have.”
