
People in the UK could also face jail if the image is widely shared. Dominic Lipinski/PA

Creating sexually explicit 'deepfakes' without consent to become a criminal offence in the UK

Anyone who creates a sexually explicit deepfake image without consent will face a criminal record and an unlimited fine.

THE UK GOVERNMENT has introduced new legislation that makes the creation of sexually explicit “deepfake” images without consent a criminal offence.

Deepfakes are digitally manipulated images, video and audio that are designed to create fake material featuring the likeness of an individual.

They have become easier to make in recent years and are increasingly used for malicious purposes.

In the US, several states have already made it a crime to share sexually explicit deepfake images which were created without the person’s consent.

The issue first came to light in 2017 when deepfake porn images of female celebrities started appearing on forum sites like Reddit.

The software needed to create these images is now widely available online and many celebrities, mostly women, have been targeted through the creation of fake, sexually explicit images.

Under the new UK legislation, anyone who creates a sexually explicit deepfake without consent will face a criminal record and an unlimited fine.

They could also face jail if the image is widely shared.

The creation of a deepfake will be an offence irrespective of whether the creator intended to share it or not, the UK’s Ministry of Justice added.

In a statement announcing the measure, the Ministry warned that deepfake images have become more prevalent in recent years, with images being viewed millions of times a month across the world.

Research in 2019 from Deeptrace Labs, the creator of a service designed to identify deepfakes, found that 96% of deepfake videos on the internet were pornographic.

The UK’s Ministry of Justice said today’s move is the “latest step aimed at tackling this emerging and deeply distressing form of abuse against women and girls”.

The sharing of sexually explicit deepfake images was criminalised in the UK last year, but this latest measure also targets those who make such images “even if they have no intent to share it but purely want to cause alarm”.

The UK’s Minister for Victims and Safeguarding, Laura Farris, said the creation of deepfake sexual images is “unacceptable irrespective of whether the image is shared”.

“It is another example of ways in which certain people seek to degrade and dehumanise others – especially women,” she said.

“And it has the capacity to cause catastrophic consequences if the material is shared more widely. This Government will not tolerate it.”

Deborah Joseph, European editorial director of Glamour, welcomed the move.

“In a recent Glamour survey, we found 91% of our readers believe deepfake technology poses a threat to the safety of women, and from hearing personal stories from victims, we also know how serious the impact can be,” she said.

“While this is an important first step, there is still a long way to go before women will truly feel safe from this horrendous activity.”

Social media

Celebrities are often the target of sexually explicit deepfakes, as was the case for Taylor Swift earlier this year.

In January, X blocked searches linked to Swift after sexually explicit AI-generated images of the pop star went viral on the site.

One post on X featuring the images was viewed close to 50 million times before it was removed from the platform.

According to US media, the post had been on the platform for around 17 hours.

“It is alarming,” said White House Press Secretary Karine Jean-Pierre when asked about the images at the time.

“Sadly we know that lack of enforcement (by the tech platforms) disproportionately impacts women and they also impact girls who are the overwhelming targets of online harassment,” Jean-Pierre added.

Non-celebrities are also victims, with increasing reports of young women and teens being harassed on social media with sexually explicit deepfakes that are increasingly realistic and easy to produce.

But the targeting of Swift shone a new light on the issue, with her millions of fans outraged at the development.