Google sues Germany over hate-speech law; Ofcom hires disinformation auditor to tackle social media
New provision in Germany's law violates the right to privacy of its users, Google says
Google said on Tuesday that it was taking legal action over Germany's expanded hate-speech legislation, which took effect in April this year.
In a blog post, the search giant said that a new provision of Germany's Network Enforcement Act ('NetzDG') violates the right to privacy of its users. The provision requires social media platforms to share with law enforcement the personal details of users who post content suspected of being hateful.
Germany's NetzDG law came into effect in early 2018, making social networks such as Facebook, YouTube and Twitter responsible for monitoring and removing hate content from their platforms. It also required digital platforms to publish regular reports on their compliance.
In May 2021, the country's parliament passed legislation adding new provisions to broaden the law's application, including a requirement to share the details of those judged to have posted hate-filled content with the Federal police. The move was criticised as heavy-handed by opposition parties and the European Commission, as well as by social media companies themselves.
"In our opinion, this massive interference with the rights of our users is not only in conflict with data protection, but also with the German constitution and European law," Sabine Frank, YouTube's regional head of public policy, wrote in the blog post.
Frank said Google believes that such massive sharing of users' personal data with law enforcement "is only possible after a detailed examination by a court and a judicial confirmation".
"For us, the protection of our users' data is a central concern. We have therefore decided to have the relevant obligations of the legislative package examined by the Cologne Administrative Court as part of a declaratory action."
Ofcom's disinformation auditor
The legal challenge comes as British media regulator Ofcom confirmed on Tuesday that it had appointed Anna-Sophie Harling as its online safety principal to tackle disinformation and harmful content on digital platforms, including Facebook and Google.
Harling is currently the managing director for the European region at NewsGuard Technologies, which audits digital publishers for accuracy of their content.
Harling will lead Ofcom's data initiative under the UK's proposed Online Safety Bill, which is due to come into effect later this year, pending approval. The bill will require tech giants to pass on information to government agencies about content on their platforms.
The bill could also see tech firms fined up to 10 per cent of global revenue.
Last week, Ofcom chief Melanie Dawes also criticised social media platforms for their failure to weed out racist messages and online abuse aimed at three England footballers following England's Euro 2020 final defeat to Italy. Abusive and racist posts targeting the players emerged on Twitter, Facebook and Instagram after their missed penalties.
Dawes condemned the social networks for their slow response and vowed to enforce new powers to fine platforms for negligence.
"Some of our incredible England football team were subjected to racist abuse on the major social media platforms," she said.
"When Ofcom has the power to regulate online safety, we will hold the social media platforms to account on abuse like this. They must be much more transparent about the rules they have in place to deal with it, and we will act to make sure those rules are properly enforced," she added.
BCS, The Chartered Institute for IT, also called for digital platforms to start requiring users to verify their identity, in order to help curb anonymous racism and other abuse targeted at vulnerable groups.