Explicit deepfakes are traumatic. How to deal with the pain.

The fake imagery upends victims' lives, but healing is possible.
By Rebecca Ruiz
Sexually explicit deepfakes traumatize victims, but healing is possible. Credit: Stacey Zhu / Mashable

Therapist Francesca Rossi works with clients who've had real images of themselves turned into sexually explicit content, without their consent. All of her current clients identify as women.

This type of image-based abuse, known as an explicit deepfake generated by artificial intelligence, is frequently perpetrated by a current or former intimate partner, or a known friend, coworker, or neighbor, as a form of harassment and stalking. Rossi, a licensed clinical social worker in New York, has seen her clients heal from this betrayal, but the journey is long, and rarely predictable.

In some states, creating and distributing an explicit deepfake might be against the law, but even so, local law enforcement may have few resources to investigate such cases.

The victim typically has to marshal her own response. One option is to track down the imagery and issue takedown notices wherever it appears, but there's no guarantee she'll locate all of it. Rossi says explicit deepfakes are often traded between individuals, then downloaded, without the victim's knowledge.

Feeling successful one day doesn't mean the next day will be the same. The imagery may pop up on new platforms. The perpetrator may send it to the survivor's friends, family, and employer. Rossi says survivors naturally become hypervigilant. They often, impossibly, want to avoid the internet altogether. Sometimes they become fixated on monitoring imagery of themselves online, using the internet excessively to do so.

"Being victimized through deepfakes can erase your sense of reality," says Rossi, noting the dissonance survivors feel because the fake imagery looks real and convincing. "They distort your understanding of the world and everything you know to be true."

Why safety planning is critical for healing

Rossi says that people need to feel safe in order to restore their sense of reality. Creating that safety happens through measures big and small.

In the beginning, when the deepfakes are discovered, Rossi says that it's important to gather trusted loved ones who can offer emotional support, help locate where the deepfakes appear, and try to remove them, or develop a strategy for navigating this complex process, possibly in partnership with law enforcement or attorneys.

The U.S.-based Cyber Civil Rights Initiative has an image abuse helpline, along with a thorough guide for what to do once you've become a victim. In the United Kingdom, people can turn to the Revenge Porn Helpline, which aids survivors of intimate image abuse.

In addition, people may want to remove their personal information from databases maintained by data brokers, which can be done through paid services or by contacting the brokers directly. That data, including a person's home address and the names of their family members, can be used for doxxing, harassment, and stalking.

Kate Keisel prioritizes physical and psychological safety planning in her work as cofounder of the New Jersey-headquartered Sanar Institute, which provides trauma-specific mental health services to survivors of interpersonal violence, including image-based sexual abuse.

Keisel says that survivors are often told by well-meaning supporters to stay off the internet when that's simply not an option for personal and professional reasons. That's why physical safety planning can include an understanding that even after initial successful takedown notices, there's no guarantee the imagery won't surface again.


Instead of hoping that the abuse will definitively end, Keisel recommends that survivors implement boundaries related to how they spend their time online, particularly if they find it distressing not to look for images. To increase their psychological safety, survivors may want to set a limit on the number of hours they devote to searching for images of themselves.

Keisel says it can be helpful for survivors to identify and stick to tasks that feel squarely within their control, like removing their personal information from the internet.

Getting and staying grounded

While practical steps, like issuing takedown notices, are key to safety planning, both Keisel and Rossi say survivors also benefit from grounding and mindfulness practices that decrease psychological distress and anxiety.

Survivors suffer particularly because their nervous system perceives a constant threat; deepfakes, after all, have the potential to re-emerge, or may still exist online or in someone else's possession.

A therapist can teach a survivor new techniques, but Keisel says activities that bring someone into the present moment, so they can fully inhabit their body, can also be powerfully calming. These can include trauma-sensitive yoga and Tai Chi.

Rossi also recommends calming strategies that stimulate the senses, such as lighting incense or a candle, and laughing, which can reduce the body's response to fear.

"We can't think our way out of trauma," says Keisel. That's why she believes "somatic," or body-based, practices help a survivor feel safe in the present moment, even if their life has been turned upside down.

Keisel says that there will be moments when a survivor's nervous system goes into panic mode because of a new development, but that it's possible to learn skills to better tolerate that distress.

The combination of safety planning, gaining more control, and self-soothing can put someone on the path to healing, Keisel says.

There is hope

Rossi and Keisel are among several therapists and professionals in the U.S. who specialize in treating survivors of image-based sexual abuse, but their expertise is uncommon. Rossi says she has more consultation requests than she can handle; they've increased markedly since AI software and apps capable of producing explicit deepfakes became more widespread late last year.

The abuse is accelerating at a pace that lawmakers and tech companies aren't matching, though the White House recently issued a call to action for digital platforms and services to tackle the problem.

The White House's recommendations included Congressional action to strengthen legal protections for survivors of image-based sexual abuse, and provide them with critical resources.

Keisel says that those who want to talk to a therapist should consider interviewing them about their treatment practices to see if they're the right fit. Survivors might avoid therapists who don't understand image-based sexual abuse, or who aren't trained to use a trauma-sensitive approach.

But Keisel doesn't want survivors to give up on the idea of healing, even if it sometimes feels unimaginable.

"There's this idea that those of us who've experienced this level of trauma are going to be stuck in place where we can't move forward," Keisel says. "When we have the right support in place, we move past these things in life."

If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.

Rebecca Ruiz
Senior Reporter

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Rebecca's experience prior to Mashable includes working as a staff writer, reporter, and editor at NBC News Digital and as a staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a master's degree from U.C. Berkeley's Graduate School of Journalism.

