Joint Committee recommends 'major changes' to Online Safety Bill
The Bill should cover more criminal offences to regulate the "land of the lawless"
Members of Parliament and peers in the House of Lords have called for 'major changes' to the Draft Online Safety Bill to protect citizens.
The Joint Committee scrutinising the Bill said more offences should be covered, including content promoting self-harm, 'cyberflashing' and fraudulent advertising. It also recommended making it illegal to intentionally send flashing images to people with photosensitive epilepsy, and placing pornography sites under a legal obligation to keep children off their services.
The Committee also said Ofcom should have greater powers to make digital platforms responsible for protecting users from child abuse, racist content and other harmful material. It said the regulator should have the ability to investigate, audit and fine tech companies, and to define mandatory codes of practice for internet service providers, with an accountable 'safety manager' who could be held responsible if a service fails to protect users.
Other recommended changes include:
- Requiring tech firms to implement mechanisms to tackle harmful anonymous accounts
- Implementing codes of conduct that deal with 'rabbit hole' algorithms
- Requiring social media firms to reveal how many underage users exist on their platforms
- Creating a digital ombudsman to deal with complaints by individuals against platforms
'Our recommendations start by setting out the purpose of the Bill: to uphold UK law on and offline equally; to protect children, public health, democracy and freedom of speech, including the press, and to hold online services to account for the safety of their products,' the MPs and peers wrote.
'The recommendations put power back in the hands of parliament and the regulator by creating mandatory codes that set expectations of the companies and end the era of catastrophic disregard for the harmful outcomes of their business practices.'
The Online Safety Bill has been years in the making. A draft was first published in May, setting out Ofcom's future role in regulating tech firms in the UK.
The Bill proposes giving Ofcom the power to fine tech firms that fail to remove illegal or harmful content from their platforms. Fines could reach £18 million or 10 per cent of a company's annual revenue, whichever is higher.
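To make the 'whichever is higher' rule concrete, here is a minimal sketch in Python. The £18 million and 10 per cent figures come from the draft Bill as described above; the function and variable names are illustrative assumptions, not anything defined in the legislation:

```python
def max_fine(annual_revenue_gbp: float) -> float:
    """Illustrative cap under the draft Bill: the greater of a
    flat £18m or 10 per cent of annual revenue."""
    FLAT_CAP_GBP = 18_000_000  # £18 million flat ceiling
    return max(FLAT_CAP_GBP, 0.10 * annual_revenue_gbp)

# For a firm with £1bn in annual revenue, 10 per cent (£100m)
# exceeds the £18m floor, so the higher figure applies.
print(f"£{max_fine(1_000_000_000):,.0f}")  # £100,000,000
```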
The regulator would also have the power to block sites and services. In addition, senior executives at companies that repeatedly break the rules could face criminal charges.
The government has two months to respond to the Committee's recommendations before the Bill is presented to Parliament for approval next year.
The Committee's Conservative chair, Damian Collins MP, said the Bill would help to regulate an industry that has become the "land of the lawless."
"A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life," he added.
In recent years, leading tech firms and online platforms, including Facebook (now Meta) and Google, have faced intense scrutiny from lawmakers and regulators worldwide over their business practices.
In October, Frances Haugen, a former Facebook employee turned whistleblower, accused the social media platform of prioritising profit over public safety.
She told a parliamentary select committee that Facebook was exacerbating online hate worldwide because its algorithms are designed to promote divisive content.
Haugen said Facebook's internal culture prioritises profitability over its impact on the wider world, and that "there is no will at the top to make sure these systems are run in an adequately safe way."
Haugen's appearance in the UK came one day after Monika Bickert, Facebook's VP of content policy, said the technology industry needs stronger regulation to address issues like misinformation, privacy and harmful content.
Bickert, who has worked at Facebook since 2012, said government regulation can help to establish standards that all companies would be required to meet, enabling people to judge how companies enforce rules on their platforms.
In September, Facebook's semi-independent Oversight Board said online platforms need 'clear rules' and should enforce them 'consistently' to give users confidence that they will be treated fairly.