EU proposal to scan private messages for child abuse may be illegal
Lawyers say proposed client-side scanning measures likely break the law in several areas
The European Union (EU) has received internal legal advice indicating that a proposed law requiring technology companies to scan private and encrypted messages for child abuse material is likely to be annulled by the courts.
In an effort to combat the significant amounts of child sexual abuse material (CSAM) that are uploaded to the internet every year, the European Commission last year announced measures intended to provide additional safeguards to protect children from online predators and harmful content on the internet.
The proposed regulations could require tech firms to identify both newly uploaded and previously identified instances of CSAM, as well as potential instances of grooming.
Detection could take place in chat messages, in data uploaded to online services, or on websites hosting offensive content. Service providers would be free to choose whichever detection technology they prefer, which would in effect mean scanning all private conversations.
If the proposed measures become EU law, they will be applicable to online hosting services, messaging apps, internet service providers and app stores, among other interpersonal communication services.
After receiving a detection order from national authorities, service providers would also be required to notify law enforcement if they discover evidence of suspected harmful content being shared or of children being groomed.
Privacy advocates have expressed concerns that the proposed measures could harm end-to-end encryption, infringe on individuals' online privacy, and introduce "mass surveillance" across the EU.
Similar proposals in the UK's Online Safety Bill have met with objections from technology companies as well as privacy and civil liberties advocates.
The legal service of the Council of the European Union has now warned that the proposed regulation presents a "particularly serious limitation to the rights to privacy and personal data", and that there is a "serious risk" it would be found unlawful on multiple grounds upon judicial review.
The leaked internal legal advice was presented last month to diplomats representing member states of the bloc.
As reported by The Guardian, the EU lawyers stated in their legal advice that the draft regulation would necessitate the general and indiscriminate screening of data processed by a given service provider. It would apply without distinction to all users of that service, including those whose conduct could not give rise to criminal prosecution, directly or indirectly, and could therefore be illegal on multiple grounds.
The proposals suggest that tech firms would have to make one of the following choices: abandon end-to-end encryption, introduce a form of backdoor to access encrypted content, or gain access to content before it is encrypted by installing client-side scanning (CSS) technology on users' devices.
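Client-side scanning of this kind is usually described as matching content against a database of fingerprints of known material on the user's device, before end-to-end encryption is applied. A minimal sketch in Python, assuming a simple hash blocklist (real systems use perceptual hashes in the style of PhotoDNA, which survive re-encoding; the function names and the digest value here are purely illustrative):

```python
import hashlib

# Hypothetical blocklist of digests of known illegal material
# (the value below is just the SHA-256 of the string "foo").
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_before_encrypt(attachment: bytes) -> bool:
    """Return True if the attachment matches the on-device blocklist.

    A cryptographic hash is used only to keep this sketch self-contained;
    deployed systems use perceptual hashes so that resized or re-encoded
    copies of an image still match.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES

# The client would run the check on-device, before encrypting the message:
if scan_before_encrypt(b"hello"):
    print("match: flag and report")
else:
    print("no match: encrypt and send")
```

Critics' point is visible even in this toy version: the check necessarily runs on plaintext on every user's device, regardless of suspicion, which is why the lawyers describe it as circumventing, rather than coexisting with, end-to-end encryption.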
"It appears that the generalised screening of content of communications to detect any kind of CSAM would require de facto prohibiting, weakening or otherwise circumventing cybersecurity measures," the lawyers wrote.
They cautioned that the European Court of Justice has previously ruled that the screening of communication metadata is proportionate only for the purpose of safeguarding national security.
As a result, it is "rather unlikely" that similar screening of communication content for the purpose of combating child sexual abuse would be considered proportionate.
Ten member states of the EU are said to be in support of continuing with the regulation without any amendments, leaving 17 against or non-committal.
MEP Patrick Breyer, a member of the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (Libe), has urged the current EU presidency, held by Sweden, to remove the blanket monitoring of private communications from the proposed legislation.
"The EU Council's services now confirm in crystal clear words what other legal experts, human rights defenders, law enforcement officials, abuse victims and child protection organisations have been warning about for a long time: obliging email, messaging and chat providers to search all private messages for allegedly illegal material and report to the police, destroys and violates the right to confidentiality of correspondence," Breyer said.
"What children really need and want is a safe and empowering design of chat services, as well as Europe-wide standards for effective prevention measures, victim support, counselling and criminal investigations," he added.