WARNING: This story discusses graphic details of sexual cybercrime committed against young women.
Levittown is a new six-part podcast series about a new form of exploitation in the generative AI era: the rise of deepfake pornography and the battle to stop it.
It tells the real story of dozens of young women who discovered that photos of themselves had been stolen from social media, manipulated, and posted on a porn site without their knowledge.
Kiwi Olivia Carville is an investigative reporter for Bloomberg News and a Polk Award winner who covers technology and online child safety.
She co-hosts the podcast series along with cybercrime reporter Margi Murphy.
Carville told Saturday Morning's Susie Ferguson that AI-generated images, or deepfakes, have become so "convincingly real that it's almost hard to believe your own eyes."
"Individuals who are pictured themselves believe it could be them."
But what makes this explosion of technology troubling, she said, is that 90 percent of the images are pornographic, often targeting young girls.
"When I started looking into this, this issue, I kind of thought it would be quite difficult to find... but no, this is being uploaded onto very public websites."
All it takes is a few harmless photos scraped from social media.
"You literally plug a photograph of somebody into one of these applications, you hit the undress button."
The podcast examines this subject, and the devastation experienced by the victims, through the lens of a particularly shocking case set in the sleepy suburb of Levittown, Long Island.
It was in this picturesque community that 40 young women "discovered the horrifying realization that fake nude images of them were circulating online."
Shockingly, Carville said, their investigation found that the police were powerless to take down the content or press charges.
"The police realized that this wasn't even a crime in the US at the time. It looks like a crime, it sounds like a crime, and it sure feels like a crime for these victims."
"But there was no law on the books saying creating fake nude images of minors was illegal at the time."
This was despite the fact that the girls were underage in many of the uploaded images.
"One of the victims that we talked to... an image of her that was taken from social media and uploaded to this pornographic website from when she was five years old."
For some victims, their full names, addresses, phone numbers, and social media handles were posted with the fake explicit images.
Many were stalked. Some feared for their lives.
"The anonymous poster was calling on the hundreds if not thousands of other users of the site to harass these young women.
"Some of them started receiving social media messages, and phone calls with just heavy breathing.
"They didn't know if they were going to have someone turn up on their doorstep or call them in the middle of the night."
The pair's investigation into the Levittown case led them to New Zealand and a former cop turned private investigator who for months had been digging into the website at the centre of it all.
"He was instrumental in trying to get that website shut down."
While the series' investigation ends with a prosecution, Carville made the case that protecting future victims from the growing deluge of new deepfake software will require systemic change.
"Millions of people are visiting these apps and these sites currently.... you try and shut one down, and then it pops up somewhere else.
"There has to be guardrails because of the impact that it has on victims.
"This is one of the biggest issues of our time."