
UK media regulator on hiring spree amid pressure to bolster online safety


Ofcom is ramping up hiring for its online safety workforce as concerns rise that the regulator lacks sufficient powers to curb the kinds of misinformation that sparked recent violent unrest in the UK.

The media regulator told the Financial Times it has more than 460 people working on the Online Safety Act, with plans to expand by 20 per cent to more than 550 by next March. This would equate to more than a third of Ofcom’s roughly 1,500 staff.

The hiring spree comes amid a wider debate, prompted by this month’s violent unrest across the country, over whether the government can effectively tackle the rise of false information online.

The riots were initially sparked by a mass stabbing in Southport on July 29, in which three girls were killed and eight other children and two adults were injured. Violence was stoked by the sharing of misinformation and disinformation on social media platforms following the attack.

[Column chart: planned and actual increase in staffing, showing how Ofcom is boosting expertise to enforce online safety]

X owner Elon Musk’s recent posts, including references to “two-tier policing” under UK Prime Minister Sir Keir Starmer and a retweet of Tommy Robinson, founder of the far-right English Defence League, have highlighted concerns over the state’s power to hold the tech billionaire to account.

The UK’s Online Safety Act was passed last year, but will not come into full effect until late 2025 or 2026.

The legislation contains one provision for dealing with disinformation, making it illegal to send a message that contains information that an individual knows to be false with the intention of causing “non-trivial psychological or physical harm to a likely audience”.

Beyond this, the act will oblige companies to adhere to their own terms of service, which are currently being drafted.

“If you’re more free speech inclined, you might not put ‘cracking down on the spreading of misinformation’ in your terms of service,” said one person involved in drafting the legislation.

Several politicians and lawyers argue that the Online Safety Act in its current form does not go far enough to hold tech companies and their executives to account for allowing falsehoods to proliferate on their platforms.

London Mayor Sadiq Khan argued in the wake of the riots that it was “not fit for purpose”, while Starmer indicated he would look at ways to “toughen up” the legislation.

“Even when enforced, the Online Safety Act would likely account for a fraction of the problems that happened online before and during the violence,” said Josh Simons, new Labour MP for Makerfield who used to work on responsible artificial intelligence at Meta.

“If we think it’s toxic and corrosive of our public sphere to have lies circulating at speed, we need to have the public interest in the minds of the people building those algorithms,” he added.

Simons, who was head of the Labour-affiliated think-tank Labour Together until last month, is among a number of lawyers and policymakers who believe the government should develop a new law to tackle the algorithms that help spread misinformation.


Others argue that extending the legislation could create excessive responsibilities for the regulator, which is already battling to define the rules that govern offences in the act.

Ofcom is concerned to ensure that its efforts to hold tech companies to account by issuing fines under the Online Safety Act are watertight. The regulator expects penalties to face legal challenges from some of the richest and most powerful groups.

Losing cases can be financially and reputationally damaging. Last week, the German government suffered a blow when the courts overturned its decision to ban the far-right magazine Compact.

The decision was seen as a gift to the nationalist periodical, which boasted of a “victory of David over Goliath” and increased readership on the back of the ruling.

One lawyer who asked to remain anonymous said they could not imagine a regulator “better equipped to implement this law” than Ofcom, given its expertise and manpower. But they said they believed the regulator had been given an “impossible task”. 

They added that the further the legislation was extended to include misinformation, “the more risk there would be of arbitrary decisions being made” by governments and platforms being asked to become “arbiters of the truth”.

Ministers have this month looked at bringing in powers to force internet companies to remove “legal but harmful” content, a provision originally included in the Online Safety Act but dropped after months of wrangling and pushback from free speech proponents.

Many experts are sceptical that reintroducing the clause will solve the problem given it could add to Ofcom’s responsibilities, forcing the regulator to weigh in on the complex question of what is deemed “harmful”.


