Pornographic deepfakes could soon become a crime under a proposed law. Photo: RNZ
Explainer - They have been described as a sadistic and degrading weapon - yet in New Zealand, nude deepfakes are not explicitly a crime.
Netsafe has received hundreds of complaints about non-consensual sexually explicit deepfakes this year alone, yet only one prosecution for deepfake pornography has ever been brought in New Zealand.
Legal experts say that's because the law does not adequately protect victims from non-consensual deepfakes.
ACT MP Laura McClure seeks to change that.
Her member's bill to criminalise non-consensual sexually explicit deepfakes has been pulled from the ballot.
ACT MP Laura McClure holds up a faked nude photo of herself that she created when discussing the Deepfake Digital Harm and Exploitation Bill. Photo: Facebook / Laura McClure
What does the proposed law say?
The Deepfake Digital Harm and Exploitation Bill looks to close a loophole by amending existing laws to expand the definition of an 'intimate visual recording'.
It would widen what a 'recording' is to include images or videos that are created, synthesised, or altered to depict a person's likeness in intimate contexts without their consent.
It would also cover the sharing of a non-consensual pornographic deepfake.
Currently, these types of deepfakes could be an offence under the Harmful Digital Communications Act.
However, critics say the law only covers cases where the offender's intention to cause harm can be proven, and proving that can be challenging.
Earlier this month, New Zealand saw what was believed to be its first prosecution for deepfake pornography. In that case, the victim and her father managed to track down the perpetrator and obtain a confession.
McClure said it is generally "next to impossible" for victims to get such damning evidence.
"So far, other victims of this kind of offending have not been successful in getting any convictions because the current act doesn't include AI synthetics/deepfakes," she said.
Kiwi sports presenter and broadcaster Tiffany Salmond. Photo: Supplied
What are victims saying?
Kiwi sports presenter and broadcaster Tiffany Salmond said explicit deepfakes were a "weapon used to degrade and humiliate".
Salmond first became aware of deepfakes of her circulating about five months ago, when a friend sent one to her.
"At first, I was shocked. I hadn't seen anything like it before, and the videos were confronting. It was surreal to see my own face making unpleasant expressions, moving in ways I never had, undressing into full nudity," she told RNZ.
Salmond, who now lives in Australia, said she has become desensitised to perverse commentary about herself online.
"I didn't feel ashamed or embarrassed by what they'd done, because I could see straight through their attempt to humiliate me, and it didn't work," she said.
"Instead, I held a mirror up to them, because if anyone should feel embarrassed, it's them. So, I used my platform to make that clear. Not just for me, but for every woman who's been targeted by this new form of digital violence."
However, she has found herself second-guessing what photos she shares online.
"I worried that if I continued sharing photos the way I always had, people might assume I wasn't taking it seriously, or worse, that I was inviting it. That was an awful feeling to suddenly feel shame around something that had never been shameful before."
McClure said these types of deepfakes are having long-term effects on victims, with some dropping out of school or university as a result. One victim told her she had made an attempt on her life after having deepfakes created and shared of her.
"What victims are saying is this is seriously harmful. It's not just a joke, it's not a photoshop of your head onto a body. It's really realistic and often really degrading," McClure said.
"It's not just nudifying someone, it's often putting them into quite sadistic pornography."
Countries around the world are wrestling with how to deal with deepfakes. Photo: 123rf
What are other countries doing?
Canterbury University senior law lecturer and online abuse expert Dr Cassandra Mudgway said New Zealand's law is behind many other countries when it comes to deepfakes.
In the UK, it is illegal to create, share or threaten to share intimate photos without consent, including deepfakes.
Earlier this year, Denmark gave people copyright to their own features in an effort to clamp down on deepfakes.
Australia has federal laws that make non-consensual sexually explicit deepfakes an offence. It is also moving to ban online tools being used to create AI-generated nude images.
Salmond, who lives in Sydney, said she was pleased to see laws pass in New South Wales in August to outlaw sexually explicit deepfakes.
"This new law means that schoolgirls being targeted by boys they know, or women being harassed by exes or disgruntled admirers, finally have the law on their side and can take action," she said.
Salmond believes non-consensual sexually explicit deepfakes should be a criminal offence in New Zealand and said McClure's bill is a positive step.
How would deepfakes be policed?
Mudgway said there are currently additional hurdles for victims of sexually explicit deepfakes.
"We are treating fakes differently to real images," she said.
"We are suggesting that a real sexualised image that's shared without consent, we are going to prosecute it in a way that is straightforward for a victim, they don't have to offer evidence of serious emotional distress, we assume that the harm has already been done. The only concern was whether it was shared without consent.
"Whereas if it's a deepfake, not real, you've suddenly got all of these additional hurdles. The bill seeks to redress that."
McClure said that, because of this, police often do not pursue deepfake cases.
Police also do not track the number of offences involving deepfake technology, as such offending does not have its own offence code.
McClure said extending the law to cover non-consensual nude deepfakes would give police more power to investigate and prosecute.
She believes it would work similarly to New Zealand's revenge porn laws, which were amended in 2022 to make prosecution easier: rather than requiring proof of intent to cause harm, the onus is on the perpetrator to prove they had permission to publish.
But what if they were made by minors?
McClure said many victims she spoke to were young women who had deepfakes made of them by their peers.
ACT Party MP Laura McClure. Photo: Supplied
Many of these perpetrators cannot be charged and prosecuted as adults.
Instead, McClure said, they can be referred to the Youth Court, which has tools such as rehabilitation and restorative justice available to it.
"We don't really want our young people being convicted of crimes… I think for the most part, a lot of them don't realise the harm that it is causing."
Making non-consensual deepfake pornography a criminal offence would send a message to young people that this is not okay, McClure said.
"Everybody now knows if someone sends a nude in a relationship, you can never send it on to your mates," McClure said.
"I hope that this will be the same kind of thing. It will set the standard. Young people will know that this is really serious, it is a form of abuse, and we consider it to be illegal."
Does the bill go far enough?
Experts say the government needs to be doing more to manage the risks of AI.
An open letter signed by more than 20 AI experts is calling on the government to better regulate the new technology.
It wants a bipartisan approach to regulate AI and the establishment of a national AI oversight body.
Mudgway, who was a co-author of the letter, said we need a future-proof approach to AI.
"It's a tool that is readily available to anyone. I mean, why not use it if it's legal, right? Why not use it if it's so easily available? Which is our issue about the absence of regulation."
Another of the letter's co-authors, Victoria University AI senior lecturer Dr Andrew Lensen, said the bill was a good first step, but urged the government to do more.
Lensen said there have been big advances in AI, and it is becoming readily available to users.
He said that while we shouldn't outright ban AI, there needs to be a wider conversation about how we regulate it.
How to get help
If an intimate image of you has been shared (or you fear it might be), you can contact StopNCII.org.
If you are under 18 and a nude, partially nude or sexually explicit photo of you has been shared, you can use a free tool called Take It Down to help remove it and stop further sharing of your images.
- Need to Talk? Free call or text 1737 any time to speak to a trained counsellor, for any reason.
- Lifeline: 0800 543 354 or text HELP to 4357.
- Suicide Crisis Helpline: 0508 828 865 / 0508 TAUTOKO. This is a service for people who may be thinking about suicide, or those who are concerned about family or friends.
- Depression Helpline: 0800 111 757 or text 4202.
- Samaritans: 0800 726 666.
- Youthline: 0800 376 633 or text 234 or email talk@youthline.co.nz.
- What's Up: 0800 WHATSUP / 0800 9428 787. This is free counselling for 5 to 19-year-olds.
- Asian Family Services: 0800 862 342 or text 832. Languages spoken: Mandarin, Cantonese, Korean, Vietnamese, Thai, Japanese, Hindi, Gujarati, Marathi, and English.
- Rural Support Trust Helpline: 0800 787 254.
- Healthline: 0800 611 116.
- Rainbow Youth: (09) 376 4155.
- OUTLine: 0800 688 5463.
If it is an emergency and you feel like you or someone else is at risk, call 111.
Sign up for Ngā Pitopito Kōrero, a daily newsletter curated by our editors and delivered straight to your inbox every weekday.