Dublin: 9 °C Friday 6 December, 2019

'An ideal vehicle for harassment': New revenge porn law will cover deepfake technology

A number of US states have this year moved to ban the use of this technology to create pornographic videos of individuals.

Image: Shutterstock

PROPOSED LEGISLATION TO tackle revenge porn will cover crimes in which ‘deepfake’ technology is used, the government has confirmed.

Deepfake technology is a form of artificial intelligence that involves the creation of fabricated content that appears to be real.

This can include, for example, manipulated videos that appear to show a politician or celebrity speaking about something or engaging in an activity. A number of US states have moved to ban deepfakes in a bid to curb doctored political videos and combat the use of this technology for revenge porn. 

Now the Department of Justice has said it will also seek to legislate for the use of this technology in cases of revenge porn.

Speaking at the Law Reform Commission’s annual conference in Dublin last week, Dr John Danaher of NUIG Law School, expressed concern about the harm this technology can cause in society. 

He said legislators need to consider whether current legislation contains adequate protections against abusive uses of deepfake technology, and to keep it in mind for any relevant laws they enact in the coming years. 

Fiction and reality

Danaher referenced an example in which this technology was used to manipulate a video of former US President Barack Obama.

Source: BuzzFeedVideo/YouTube

He said researchers are experimenting with creating entirely synthetic audio files, meaning they would no longer need existing recordings of a person, or an impersonator, to mimic that person's voice, as seen above. 

“If you look at this video, it’s not entirely convincing, there’s something stilted and artificial about it but as the tech improves, it’s likely that even the most discerning viewers of this will find it difficult to tell the difference between fiction and reality,” he said. 

He said there is already a long history of synthetic representations of political figures, such as caricatures, but he believes there is “something pretty worrying about deep fake technology”.

“The highly realistic nature of the audiovisual material created makes this the ideal vehicle for harassment, manipulation or fraud,” Danaher said. 

He said he is most concerned about how deepfake technologies will be "weaponised to harm and intimidate others, particularly members of vulnerable populations".

It’s difficult to legislate in this area. How do you define the difference between real and synthetic media? How do you balance free speech rights against the potential harm to others? Do we use specialised laws to deal with these problems, or can existing laws on, say, defamation or fraud be up to the task?

“Furthermore, given that deep fakes can be created and distributed by unknown actors, who would any cause of redress be against?”

Deeptrace, an Amsterdam-based cybersecurity company, published a report this year which found that 96% of the deepfake videos on the internet are pornographic. And 100% of the deepfake content on pornography websites featured female subjects. 

By contrast, non-pornographic deepfake videos on YouTube contained a majority of male subjects. 

Source: Deeptrace

The report quotes Professor of Law at Boston University Danielle Citron, who said this technology was being “weaponised against women by inserting their faces into porn”. 

“It is terrifying, embarrassing, demeaning and silencing. Deepfake sex videos say to individuals that their bodies are not their own and can make it difficult to stay online, get or keep a job and feel safe.”

The ten individuals most frequently targeted by deepfake pornography include a number of British actresses and several South Korean musicians. 

In June this year a computer app that enabled users to ‘strip’ photos of clothed women using this type of technology was launched. The volume of traffic and download requests in the first 24 hours caused the website to go offline, and the creators subsequently said they would not release any further versions because “the world is not ready”. 

However, the software continues to be independently repackaged and distributed. 

Legislating

The Cabinet here has approved proposed legislation aimed at tackling the non-consensual distribution of intimate images. The Harassment, Harmful Communications and Related Offences Bill, which provides for a six-month prison sentence upon conviction, was originally put forward by the Labour Party in 2017. 

The bill is based on a Law Reform Commission report, which recommends the creation of two offences: one forbidding the posting online of intimate images without consent, the other preventing the secret filming or photographing of a person’s private areas, also known as ‘upskirting’ and ‘down-blousing’.

The government is now drafting its own amendments to the bill.

In its current form, the wording of the bill covers images that have been “altered”. Dr Danaher last week said this wording may not withstand challenges in cases where deepfake technology is used. 

“Someone might argue that synthetically constructed images are not strictly speaking altered, they are constructed and artificial.”

He said the wording should be modified to include the possibility of synthetic revenge porn. 

In July, the US state of Virginia expanded its revenge porn legislation to include fabricated or manipulated videos and images. This means it is illegal in the state to share nude images of a person, whether or not they are real. 

The state of California in recent weeks also passed two bills – one making it illegal to post manipulated videos of political figures and a second giving people the right to sue anyone who puts their image into a pornographic video. 

In response to a query from TheJournal.ie about Danaher’s suggestion, the Department of Justice said it is giving “significant consideration to the definition of an intimate image” as it prepares its amendments to the bill.

“It is intended that such a definition will encompass images or videos that have been altered or created to make it seem like an individual is featured in the image or video.

“The amendments are currently being drafted by the Office of Parliamentary Counsel. Once the amendments are finalised, it is intended that they will be brought forward to Committee Stage in the Dáil.”
