UK criminalises deepfakes: lawyers respond

‘Important and much needed reforms to close the gap in current legislation’

On Tuesday the government announced a package of proposed new rules and amendments to existing measures in response to the rising menace of sexually explicit deepfake images and videos circulating on the web.

The new rules extend existing legislation on revenge porn to cover fake sexually explicit images and videos that appear to feature the victim but were in fact created by AI: they are not real, but they look real.

Revenge porn was criminalised in 2015. Its more recent official descriptor, “intimate image abuse” (IIA), acknowledges that revenge may not be the motive for sharing personal images and videos.

The government hopes that the prospect of a two-year prison sentence will give would-be perpetrators pause for thought. However, the domestic violence charity Refuge has reported low rates of police follow-up on reports of IIA under the existing rules, and of victims withdrawing accusations for fear of the consequences.

Nevertheless, legal experts and charities broadly welcomed the proposed new laws, which criminalise creating sexually explicit deepfakes; recording intimate images without consent; and installing equipment to facilitate the taking of intimate images without consent.

“These are important and much needed reforms to close the gap in current legislation, which is limited only to sharing or threatening to share intimate images without consent,” said Emma Woollcott, partner and head of reputation protection and crisis management at law firm Mishcon de Reya.

The government must take measures to future-proof the rules around IIA, she said, as technology has rapidly outpaced legislation and this form of abuse is growing fast.

“Ninety-eight percent of deepfakes are pornographic, and 99% of those are of women and girls. And while there have been instances where famous women have had deepfake images created of them, the campaign group My Image My Choice found that the most targeted group of people are ordinary women and girls.”

Victims often feel “utterly dehumanised” and are made “physically sick” by seeing themselves depicted in this way, Woollcott said.

Deepfakes are frequently used to target people, predominantly women, “whether in an attempt to silence them, or punish them for an unpopular position,” said Will Richmond-Coggan, a privacy and data litigator at Freeths LLP.

“But even where the purpose is not a public attack on the individual, these deepfakes are far from harmless. From a legal perspective, there is the issue that they are using technology to make inferences about a person's private information and creating false personal data about someone's sex life and sexuality.”

According to a recent survey [pdf] commissioned by cybersecurity vendor Kaspersky with support from SWGfL, a UK charity that hosts the Revenge Porn Helpline, a third (33%) of UK respondents said they either know someone who has suffered this form of online abuse or have survived it themselves (7%).

Fourteen percent of those surveyed admitted to sharing explicit images of other people, 11% for revenge. Perpetrators can take advantage of the fact that younger people in particular feel comfortable sharing images of themselves in chat forums: 20% of UK respondents said they have intimate images of themselves stored on mobile devices.

Sophie Mortimer, Revenge Porn Helpline manager at SWGfL, said: “We can see every day that intimate image abuse is a continuing problem, but this study shows us where we need to take action: building a national and international conversation about the meaning and importance of consent, improving online safety knowledge for adults and young people alike, and making it clear that, when intimate image abuse happens, it is the perpetrators who are entirely at fault.”