Online Safety is here and Ofcom is waiting in the wings
The regulator is ready to test its new powers
After a long wait, the Online Safety Act is approaching its final form. All companies must be ready if they want to avoid hefty fines, write Emma Wright and Grace Tang of Harbottle & Lewis.
Illegal content and material harmful to vulnerable groups have long circulated online, and the lack of protection for users set the UK on the long and arduous path of legislating to keep those users safe.
After private members' bills, a green paper in 2017, extended periods of comprehensive market studies, industry consultations and heated debates, the Online Safety Act (OSA) finally received Royal Assent in autumn 2023, with the recognition that much of the implementation would be delegated to the new online safety regulator, Ofcom.
Ofcom has pushed on with this: launching a consultation on online harms 12 months ago (November 2023); producing draft guidance on online pornography services a month later (December 2023); and in the spring of this year publishing the draft Children's Safety Codes of Practice (May 2024), setting out how online services are expected to meet their legal responsibilities to protect children online.
At the same time, Ofcom set the expectation that the final Children's Safety Codes of Practice would be published 'within a year', with services then having three months to conduct their children's risk assessments.
Ofcom has recently updated its implementation plan. The regulator is on standby, with chief executive Melanie Dawes declaring that Ofcom is ‘absolutely prepared’ to impose heavy sanctions.
Will I be caught by the OSA?
If you run a user-to-user service or a search service with any sort of connection to the UK, then the likelihood is yes; the Act is extremely far-reaching in whom it applies to. Cumulative additional obligations will apply to different sets of categorised services, depending on factors such as the number of UK users and the service's specific functionality.
The duties are considerable; the specifics are not covered here but include the following:
- Carrying out illegal content risk assessments;
- Complying with illegal content safety duties (including taking steps to prevent users from encountering such content, removing illegal content, providing a user reporting mechanism, and reflecting such processes in their terms of service or a public statement);
- Carrying out children's access assessments and children's risk assessments;
- Complying with age assurance duties (for pornographic content services);
- Complying with children's safety duties (including preventing children from encountering certain types of harmful content); and
- Responding to Ofcom’s information notices and transparency notices.
The Act also creates new criminal offences, such as sending a false communication with intent to cause harm, cyber-flashing, and encouraging or assisting self-harm.
Online service providers are facing a significant and comprehensive legal framework that requires them to consider and address all manner of internet safety.
The duties detailed above require risk assessments and an evaluation of what measures are proportionate for the individual business and its services, with a particular focus on safeguarding children. Adults aren't left behind either: providers must give adults more autonomy and discretion to manage their own online experience (such as filtering content and blocking anonymous contacts).
What does this mean for technology businesses and social media companies?
First and foremost, businesses need to assess whether they fall within the scope of the OSA and, if so, at what level. Careful review of the draft guidance, together with anticipating the actions that will arise when Ofcom releases the final versions and preparing the information and evidence internally for risk assessments, are key steps to carry out urgently.
These widespread duties are accompanied by Ofcom's significant enforcement powers, including penalty notices of up to £18 million or 10% of global revenue (whichever is greater) for the worst offenders. Ofcom is currently consulting on how it will calculate maximum penalties.
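By way of illustration (using purely hypothetical figures): for a service with £1 billion in global annual revenue, 10% of revenue is £100 million, which exceeds the £18 million floor, so the higher figure of £100 million would be the maximum penalty; for a service with £50 million in revenue, 10% is only £5 million, so the £18 million figure would apply instead.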
The regulator is also not holding back, as we've seen from its press statements. Ofcom has already fined TikTok £1.875 million for failing to comply with its duty to provide information in response to a formal notice issued by Ofcom under the Communications Act. We see no reason why Ofcom would temper its approach for those caught by the OSA.
It's also important for firms to be aware that there will be a fee regime (though not until 2026/2027). It's currently expected that a firm will only need to pay fees to Ofcom if it reaches a certain level of worldwide revenue.
Tech firms and social media platforms will also have to grapple with other jurisdictions implementing their own online safety regimes. Ireland recently adopted its online safety code for video-sharing platforms as part of its Online Safety Framework. Australia has introduced an online misinformation bill, which has not been without controversy. The UK and US have agreed to boost global efforts through a joint children's online safety working group, aiming to promote transparency from firms.
Of course, jurisdictions naturally hold different, and sometimes conflicting, views on what is necessary to protect their citizens online, but firms (especially new entrants) may find the varied rules difficult to manage.
Firms will also have to keep in mind collaboration between regulators: for example, the joint statements by Ofcom and the ICO on the intersection of online safety and personal data, and the Digital Regulation Cooperation Forum, which brings together the Financial Conduct Authority, the Competition and Markets Authority, the ICO and Ofcom.
Final thoughts
The OSA has been on the horizon for a long time, and while some firms may still feel unprepared for the new regime, Ofcom is taking a strong stance against non-compliance.
This summer the UK saw the effects of disinformation, with waves of violence and riots across the country. Firms were criticised for the disinformation circulating on their platforms, and the issue has become a point of focus, as seen in Ofcom's statement on these events, which referred to the OSA. There is likely to be significant scrutiny of how firms apply their terms of service (which often ban harmful disinformation). The new government will review the Online Safety Act following the riots, so this remains high on the agenda.
Melanie Dawes's statement that "2025 will be a pivotal year in creating a safer life online" is one we firmly agree with, and this will remain a challenging area for many years to come.
Emma Wright is a partner and technology lead, and Grace Tang is an associate, at Harbottle & Lewis.