
Head of the Child Protection Group Javier Izquierdo de la Rosa speaking to media about the investigation into images of nude minors created or manipulated with AI. Alamy Stock Photo
VOICES

Fake Porn, Real Victims: We must stop the easy use of AI to create nude images of women & girls

Two experts look at the worrying case of deepnude image generation in Spain this week.

DID YOU KNOW that AI technology is being used to turn everyday, ordinary photos of women and girls into pornography?

This reality shocked many last week when it emerged that social media posts of young girls in Spain had been used to generate nude and explicit sexual images using AI – what is often referred to as deepfake or deepnude porn.

The girls are as young as 11, and the images were created and circulated widely without their knowledge or consent. A number of young boys have been identified as responsible and the police are investigating.

Government minister Isabel Rodríguez speaking on 19 September in Madrid. She addressed the social alarm generated by cases such as that of several minors involved in the alleged manipulation, with AI, of images of girls (also minors) without clothes, denounced by their families in Almendralejo (Badajoz). Alamy Stock Photo

Most of the debate following these revelations has focussed on potential criminal charges against the boys, the need for improved cybersecurity and the familiar blame game targeting either parents for giving young people phones or the girls themselves for posting on social media.

But this misses the mark. We need to set our sights elsewhere.

Who is responsible?

First, the tech companies and particularly search engines like Google. Type ‘deepfake porn’ or ‘nudify’ into Google and it immediately offers links to websites dedicated to creating non-consensual pornography from everyday photos. Within minutes, you can upload fully clothed photos of female colleagues, friends and celebrities, including young girls, and get a nude image back.

With a little more time, you can generate full pornographic videos using ‘deepfake’ apps. This AI-generated porn can be distributed far and wide with few being able to tell whether it is real or fake.

That Google showcases these websites makes them seem legitimate. In practice, Google is facilitating the abuse and harassment of women and girls. It could de-rank these websites, making them much more difficult to find. We’ve known about this technology for years and so it’s time to act. Campaigns are beginning and new online regulation in the UK, Ireland and the EU’s Digital Services Act should hasten action by Google and others, though only if regulators are proactive and take violence against women and girls seriously.

The porn question

But while making these websites more difficult to access would be a good start, we also need to think about why some men and boys are using AI technology to exploit women and girls. And this is where mainstream pornography requires attention.

The nature of porn has changed over the past few decades, and women’s non-consent and humiliation are now centre-stage.

Our research found that sexually violent pornography is being shown on mainstream sites to first-time users, including whole genres devoted to the creation and sharing of non-consensual imagery – commonly referred to as image-based sexual abuse. Young people today are growing up with free access to this kind of porn, and it is setting their baseline for what sex is, who it’s for and how to do it. It should therefore be little surprise that young boys and men are seeking to create their own non-consensual porn, with little care for the harm they are causing.

This doesn’t mean that individuals using AI technology to create ‘deepnudes’ aren’t responsible for their actions. But it does show the urgent need for us to face up to the content of mainstream porn and recognise its role in creating a conducive context for the abuse and harassment of women and girls. It won’t be easy, but change is possible.

Role of legislation

At the moment the EU is debating possible new criminal laws targeting online abuse, including deepfake porn. While an early draft excluded nude images, the latest version thankfully remedies that gap. However, EU governments seem to want to severely limit any new law by providing a defence of ‘freedom of expression and art’.

Such an approach fails to understand that deepfake porn is being used right now to curtail women’s freedom of speech and to force us out of public life. This was powerfully shown in a recent documentary featuring politician Cara Hunter from the north of Ireland, who had fake images of her distributed while she was campaigning for election.

But it’s not only politicians and celebrities who are being threatened. As the case of the young girls from Spain shows, all women and girls are potential victims of this epidemic of deepfake porn and non-consensual nudes. None of us are safe and our freedom is being restricted right now. Indeed, that’s the point.

So, internet intermediaries like Google must be made to act to reduce the ease of access to this technology. The draft EU directive must be amended to give comprehensive protection to women and girls from deepfake porn. Governments and regulators must recognise the immediate threat and danger of AI to women and girls and take proactive steps to demand change. And, finally, we need a cultural reckoning with the reality of mainstream porn. This may be ‘deepfake’ porn, but it creates real victims.

Clare McGlynn is a Professor of Law at Durham University in the UK. Fiona Vera-Gray is the Deputy Director of the Child and Woman Abuse Studies Unit at London Metropolitan University.
