
Meta showed nearly 10,000 ads for deepfake clothes 'erasing' app to Irish users since December

Legal experts have warned that creating images with so-called ‘nudify’ apps is not against the law.

LEGAL EXPERTS SAY there is no law against creating explicit, non-consensual deepfake images of women, despite legislation banning such images from being shared.

The Journal reported earlier this week that so-called ‘nudify’ apps, which allow users to create non-consensual, fake nude images of any woman, have been promoted to Irish Facebook users in recent weeks.

The apps ask users to upload ordinary clothed photos of women, then use artificial intelligence to ‘erase’ the clothing and produce a pornographic image.

A subsequent analysis of the Meta ad library has found that one service offering to “erase” women’s clothing featured in almost 10,000 sponsored posts pushed to Irish users on Facebook, Instagram and Messenger since December.

Meta has removed the ads for breaching its advertising policies, which prohibit nudity and sexual exploitation.

One version of the ad claimed that the app “can see through objects” and showed a video of a woman dancing with her clothes on, followed by a brief shot of the same woman apparently undressed.

But despite the deletions, those pushing the so-called ‘nudify’ app managed to circumvent Meta’s moderation by posting new versions of the ads from different profiles each time they were removed.

Although some ads were removed by Meta after less than an hour, the company’s ad library also shows that others were active for as long as four days.

The Journal searched the ad library using four separate links to the AI app and found around 9,500 ads containing those links on Meta’s platforms, all of which have since been removed.

Last week, the Dublin Rape Crisis Centre said that it was “deeply concerned” about the apps and the capacity of similar deepfake images to “amplify harm to women”.

However, although sharing such images (or threatening to do so) is a crime under the Harassment, Harmful Communications and Related Offences Act, also known as Coco’s Law, legal experts say the legislation does not cover their creation.

Dr Catherine O’Sullivan, an expert in Criminal Law and Criminology at University College Cork, suggested that this may have been an oversight at the time the law was drafted in 2020.

“Deepfakes didn’t really exist then to the same extent that they do now. I don’t think it was necessarily a deliberate exclusion, but it may be that the perception is that the harm only occurs when the image is distributed,” she said.

O’Sullivan also said that laws which deal with images of child sexual abuse could set a precedent for banning the creation of non-consensual, explicit deepfake images. 

“When it comes to child sexual abuse imagery, it is clearly stated in the relevant legislation that it is an offence to produce images – and that includes made-up images – of that abuse. So there is precedent for it to be explicitly criminalised.” 

Michael O’Doherty, a practising barrister who has written about the law and online spaces, also said there is no law in relation to the use of somebody else’s face.

The issue of deepfaking an intimate image would be covered by the 2020 Act depending on how it is used, he said, but there is no actual law in relation to the use of somebody’s image.

“There is nothing legally speaking to stop someone creating an image here; someone can take someone’s face and put it on someone else’s body to show them involved in a sex act, for example, but there is no law against that if it stays on your computer,” he said.

Only a handful of countries prohibit the creation of deepfake sexual images.

Last year, the British government introduced legislation that would criminalise the creation of sexually explicit “deepfake” images without consent.

If you have been affected by any of the issues mentioned in this article, you can reach out for support through the following helplines:

  • Dublin Rape Crisis Centre - 1800 77 8888 (free, 24-hour helpline)
  • Samaritans - 116 123 or email jo@samaritans.org (suicide, crisis support)
  • Pieta - 1800 247 247 or text HELP to 51444 (suicide, self-harm)
  • Teenline - 1800 833 634 (for ages 13 to 19)
  • Childline - 1800 66 66 66 (for under 18s)
