Ofcom publishes first set of new online safety rules

First compliance deadline three months from now

Image: Campaigners for child internet safety are disappointed with the scope of the rule changes

Ofcom has today published the first set of online safety rules for online service providers subject to the Online Safety Act. Platforms now have three months to assess the risk of their users encountering illegal content and mitigate those risks – or face enforcement action.

Ofcom published proposals about the steps providers should take to address illegal harms on their services shortly after the Online Safety Act became law last year.

Since then, the regulator has been consulting with industry, charities and campaigners, parents and children, as well as expert bodies and law enforcement agencies.

However, pressure has been building on Ofcom to move faster in implementing and enforcing better online safety standards. That pressure became more acute after the riots this summer, widely considered to have been fuelled by social media activity.

“This decision on the Illegal Harms Codes and guidance marks a major milestone, with online providers now being legally required to protect their users from illegal harm,” Ofcom wrote in a statement.

“Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of March 16, 2025. Subject to the Codes completing the Parliamentary process, from March 17, 2025, providers will need to take the safety measures set out in the Codes or use other effective measures to protect users from illegal content and activity.”

“We are ready to take enforcement action if providers do not act promptly to address the risks on their services,” it added.

Failure to comply risks fines of up to 10% of global annual turnover or £18 million, whichever is greater.

“The duties in the Act apply to providers of services with links to the UK regardless of where in the world they are based. The number of online services subject to regulation could total more than 100,000 and range from some of the largest tech companies in the world to very small services,” wrote Ofcom.

Speaking to BBC Radio 4’s Today program this morning, Ofcom CEO Melanie Dawes suggested that 2025 will finally see significant changes in how major tech platforms operate.

“What we’re announcing today is a big moment, actually, for online safety, because in three months' time, the tech companies are going to need to start taking proper action,” she said. “What are they going to need to change? They’ve got to change the way the algorithms work. They’ve got to test them so that illegal content like terror and hate, intimate image abuse, lots more… that doesn’t appear on our feeds.”

“And then if things slip through the net, they’re going to have to take it down. And for children, we want their accounts to be set to be private, so they can’t be contacted by strangers,” she added.

In a statement, Technology Secretary Peter Kyle said:

“This government is determined to build a safer online world, where people can access its immense benefits and opportunities without being exposed to a lawless environment of harmful content.

“Today we have taken a significant step on this journey. Ofcom’s illegal content codes are a material step change in online safety, meaning that from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws which protect us in the offline and the online world. If platforms fail to step up, the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites.

“These laws mark a fundamental re-set in society’s expectations of technology companies. I expect them to deliver and will be watching closely to make sure they do.”

Campaigners disappointed

Child safety campaigners have expressed frustration at both the scope of the rules and the pace of change. The statement from Ofcom is just the latest step in enforcing a sprawling piece of legislation, and the regulator is still assessing further measures and duties, including what Dawes described as “wider protections for children.” These more substantive, child-focused measures may not be enforceable until later next year.

The Molly Rose Foundation, which was set up by the family of Molly Russell, who ended her life after viewing suicide content on social media, made clear its disappointment.

“Ofcom’s task was to move fast and fix things but instead of setting an ambitious precedent these initial measures will mean preventable illegal harm can continue to flourish,” the charity’s chief executive Andy Burrows said.

“While we will analyse the codes in full, we are astonished and disappointed there is not one single targeted measure for social media platforms to tackle suicide and self-harm material that meets the criminal threshold.

“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life. Today makes clear that there are deep structural issues with the Online Safety Act.

“The Government must commit to fixing and strengthening the regime without delay.”