5 takeaways from CNBC’s investigation into ‘nudify’ apps and sites

Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.

Jordan Wyatt | CNBC

In the summer of 2024, a group of women in the Minneapolis area learned that a male friend had used their Facebook photos and artificial intelligence to create sexualized images and videos of them.

Using an AI site called DeepSwap, the man secretly created deepfakes of the friends and of more than 80 women in the Twin Cities region. The discovery caused emotional trauma and led the group to seek the help of a sympathetic state senator.

As a CNBC investigation shows, the rise of “nudify” apps and sites has made it easier than ever for people to create nonconsensual, explicit deepfakes. Experts said these services are all over the internet, with many promoted via Facebook ads, available for download in the Apple and Google app stores, and easily found through simple web searches.

“That’s the reality of where the technology is right now, and that means that any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.

CNBC’s reporting shines a light on the legal quagmire surrounding AI, and how a group of friends became key figures in the fight against nonconsensual, AI-generated porn.

Here are five takeaways from the investigation.

The women lack legal recourse

Because the women weren’t underage and the man who created the deepfakes never distributed the content, there was no apparent crime.

“He did not break any laws that we’re aware of,” said Molly Kelley, one of the Minnesota victims and a law student. “And that is problematic.”

Now, Kelley and the other women are advocating for a state bill, proposed by Democratic state Senator Erin Maye Quade, that is intended to block nudify services in Minnesota. Should the bill become law, it would levy fines on the entities that enable the creation of the deepfakes.

Maye Quade said the bill is reminiscent of laws that prohibit peeping into windows to snap explicit photos without consent.

“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said in an interview with CNBC, referring to the speed of AI development.

The harm is real

Jessica Guistolise, one of the Minnesota victims, said she continues to suffer from panic and anxiety stemming from the incident last year.

Sometimes, she said, the simple click of a camera shutter can cause her to lose her breath and begin trembling, her eyes welling with tears. That’s what happened at a conference she attended a month after first learning about the images.

“I heard that camera click, and I was quite literally in the darkest corners of the internet,” Guistolise said. “Because I’ve seen myself doing things that are not me doing things.”

Mary Anne Franks, professor at the George Washington University Law School, compared the experience to the feelings victims describe when talking about so-called revenge porn, or the posting of a person’s sexual photos and videos online, often by a former romantic partner.

“It makes you feel like you don’t own your own body, that you’ll never be able to take back your own identity,” said Franks, who is also president of the Cyber Civil Rights Initiative, a nonprofit organization dedicated to combating online abuse and discrimination.

Deepfakes are easier to create than ever

Less than a decade ago, a person would have needed to be an AI expert to make explicit deepfakes. Thanks to nudify services, all that’s required now is an internet connection and a Facebook photo.

Researchers said new AI models have helped usher in a wave of nudify services. The models are often bundled within easy-to-use apps, so that people lacking technical skills can create the content.

And while nudify services can contain disclaimers about obtaining consent, it’s unclear whether there is any enforcement mechanism. Additionally, many nudify sites market themselves simply as so-called face-swapping tools.

“There are apps that present as playful and they are actually primarily meant as pornographic in purpose,” said Alexios Mantzarlis, an AI security expert at Cornell Tech. “That’s another wrinkle in this space.”

Little is known about nudify service DeepSwap

The site that was used to create the content is called DeepSwap, and there’s not much information about it online.

In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote from Penyne Wu, who was identified in the release as CEO and co-founder. The media contact on the release was Shawn Banks, who was listed as marketing manager. 

CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.

DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”

However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong. 

AI’s collateral damage

Maye Quade’s bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake that they generate in the state of Minnesota.

Some experts are concerned, however, that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts.

In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.” 

Kelley hopes that any federal AI push doesn’t jeopardize the efforts of the Minnesota women.

“I’m concerned that we will continue to be left behind and sacrificed at the altar of trying to have some geopolitical race for powerful AI,” Kelley said.

WATCH: The alarming rise of AI ‘nudify’ apps that create explicit images of real people.

