Ofcom is serious about online safety law, but legal challenges remain

Online safety and content moderation are key issues globally. The UK now has a law addressing them, and its regulator is serious about enforcing it.

A lot has been made of the UK’s Online Safety Act (OSA) since it was passed into law late last year. On paper, it is a comprehensive piece of legislation, the first of its kind anywhere in the world, intended to ensure that children and adults in the UK are safe from potential harms on the internet.

It puts the responsibility on online tech giants to ensure that content on their platforms is moderated and does not pose a risk to individuals or to society.

Implementing it, however, can be a challenge given how the ‘online world’ works. Sceptics are asking some important questions.

For instance, how will a UK regulator force global tech firms and online platforms, some of which are based outside the UK, to comply with the law? The old adage about the difference between passing laws and actually enforcing them comes to mind. Also, where do you draw the line between ‘content moderation’ and ‘freedom of speech’?

But Ofcom, the UK regulator responsible for implementing the law and taking action against violators, has insisted the legislation holds all the necessary answers.

In an interview with the FT, Ofcom CEO Melanie Dawes said the media regulator will take “strong action” against tech companies that break the new rules on content moderation, even though it has limited powers to stop the spread of lies online.

Why is the Act important?

The OSA has triggered serious debate in light of the widespread riots in many parts of the UK in August. Many have argued that online misinformation, hate speech and false propaganda were key factors behind the unrest.

It also laid bare the government’s lack of legal authority to block online content that causes harm to individuals, businesses and society as a whole. The plan is now for the OSA to fill exactly that gap.

The Act requires tech giants and online platforms (social media and websites) to “identify, mitigate and manage the risks of harm (including risks which particularly affect individuals with a certain characteristic) from (i) illegal content and activity, and (ii) content and activity that is harmful to children.”

It also requires online platforms to be safe by design, and that:

  • a higher standard of protection is provided for children than for adults,
  • users’ rights to freedom of expression and privacy are protected, and
  • transparency and accountability are provided in relation to those services.

What are the challenges?

There are many. Perhaps one of the biggest is that not everything deemed ‘harmful’ is illegal. The Act allows the prosecution of individuals and firms who ‘spread false information with the intention of causing harm’.

But what if someone is simply lying, or makes a mistake when posting online? That is not illegal, though it may still cause harm. Ofcom accepts this is a legal loophole. Its CEO told the FT:

“It isn’t entirely straightforward to know how you create rules here that deal with harmful disinformation while also allowing people to have their voice [and] maybe make mistakes.”

Critics are also concerned that the Act will undermine other basic human rights, such as freedom of speech. This criticism has come from some powerful quarters too, including Elon Musk, the owner of X (formerly Twitter), which is one of the largest and most influential social media platforms at present.

Over the past couple of months, Mr. Musk has repeatedly suggested that freedom of speech is under threat in the UK.

The challenge for the UK government is that many agree with this view. Going back to the August unrest, it is unlikely that the riots would have happened simply as a result of the spread of online misinformation, without some deeper underlying strains being present within communities and society more broadly.

And the tension between the freedom to express one’s legitimately held views and the need for the law to change to keep pace with increasingly powerful technologies has always been there. Concern about the curtailing of freedom of speech was partly why lawmakers were forced to drop certain provisions from the legislation while it was being prepared.

But Melanie Dawes has emphasised that necessary action will be taken against anyone who violates the law, including X (formerly Twitter). She told the FT that Ofcom would “make sure that X follows the rules that have been set down in the act . . . and that action needs to take place next year”.