Explicit deepfakes are now a federal crime. Enforcing that may be a major problem.

A law that's both narrow and broad may prove too difficult to follow.
By Chase DiBenedetto
Experts have mixed feelings on whether the Take It Down Act will live up to its promises. Credit: the-lightwriter / iStock / Getty Images Plus via Getty Images

On May 19, President Donald Trump and First Lady Melania Trump beamed to press and allies as they signed the administration's first major piece of tech regulation, the bipartisan Take It Down Act.

It was seen as a win for those who have long been calling for the criminalization of NDII, or the nonconsensual distribution of intimate images, and for a federal pathway of redress for victims. Cliff Steinhauer, director of information security and engagement at the National Cybersecurity Alliance, said it may be a needed kick in the pants to a lethargic legislative arena.

"I think it's good that they're going to force social media companies to have a process in place to remove content that people ask to be removed," he said. "This is kind of a start; to build the infrastructure to be able to respond to this type of request, and it's a really thin slice of what the issues with AI are going to be."

But other digital rights groups say the legislation may stir false hope for swift legal resolutions among victims, with unclear vetting procedures and an overly broad list of applicable content. The law's implementation is just as murky.

The act's notice and takedown provision could pose major problems 

"The Take It Down Act’s removal provision has been presented as a virtual guarantee to victims that nonconsensual intimate visual depictions of them will be removed from websites and online services within 48 hours," said the Cyber Civil Rights Initiative (CCRI) in a statement. "But given the lack of any safeguards against false reports, the arbitrarily selective definition of covered platforms, and the broad enforcement discretion given to the FTC with no avenue for individual redress and vindication, this is an unrealistic promise." 

Exacerbating free speech and content moderation concerns

These same digital rights activists, who issued warnings throughout the bill's congressional journey, will also be keeping a close eye on how the act may affect constitutionally protected speech, fearing that publishers may remove legal speech to preempt criminal repercussions (or flatly suppress free expression, such as consensual LGBTQ pornography). Some worry that the bill's takedown system, modeled after the Digital Millennium Copyright Act (DMCA), may over-inflate the authority of the Federal Trade Commission, which can now hold online content publishers accountable to the law with effectively unlimited jurisdiction.

"Now that the Take It Down Act has passed, imperfect as it is, the Federal Trade Commission and platforms need to both meet the bill’s best intentions for victims while also respecting the privacy and free expression rights of all users," said Becca Branum, deputy director of the Center for Democracy & Technology (CDT)'s Free Expression Project. "The constitutional flaws in the Take It Down Act do not alleviate the FTC's obligations under the First Amendment."


A lack of government infrastructure

Organizations like the CCRI and the CDT spent months lobbying legislators to adjust the act's enforcement provisions. The CCRI, which penned the framework the Take It Down Act is based on, has taken issue with the legislation's exceptions for images posted by someone who appears in them, for example. The group also fears the removal process may be ripe for abuse, including false reports made by disgruntled individuals or politically motivated groups under an overly broad scope for takedowns.

The CDT, conversely, argues that the law's AI-specific provisions are too narrow. "Take It Down’s criminal prohibition and the takedown system focus only on AI generated images that would cause a 'reasonable person [to] believe the individual is actually depicted in the intimate visual depiction.' In doing so, the Take It Down Act is unduly narrow, missing several instances where perpetrators could harm victims," the organization argues. For example, a defendant could plausibly get around the law by publishing synthetic likenesses placed in implausible or fantastical environments.

Just as confusing is that while the FTC's takedown authority over applicable publishers is vast, other sites are exempt from its oversight, such as those that host their own curated content rather than user-generated synthetic content. Instead of being forced to take down media under the 48-hour stipulation, these sites can only be pursued in a criminal case. "Law enforcement, however, has historically neglected crimes disproportionately perpetrated against women and may not have the capacity to prosecute all such operators," the CDT warns.

Steinhauer theorizes that the bill may face a general infrastructure problem in its early enforcement. Publishers, for example, may find it difficult to corroborate within the 48-hour window that the individuals filing claims are actually depicted in the NDII, unless they beef up their own moderation investments; most social media platforms have scaled back their moderation processes in recent years. Automatic moderation tools could help, but they're known to have their own set of issues.

No cohesion on AI regulation

There's also the question of how publishers will spot and prove that images and videos are synthetically generated, a problem that's plagued the industry as generative AI has grown. "The Take It Down Act effectively increases the liability for content publishers, and now the onus is on them to be able to prove that the content they’re publishing is not a deepfake," said Manny Ahmed, founder and CEO of content provenance company OpenOrigins. "One of the issues with synthetic media and having provable deniability is that detection doesn’t work anymore. Running a deepfake detector post hoc doesn’t give you a lot of confidence because these detectors can be faked or fooled pretty easily, and existing media pipelines don't have any audit trail functionality built into them.”

It's easy to follow the logic of such a strong takedown tool being used as a weapon of censorship and surveillance, especially under an administration that is already doing plenty to sow distrust among its citizens and wage war on ideological grounds.

Steinhauer still urges an open mind. "This is going to open a door to those other conversations and hopefully reasonable regulation that is a compromise for everyone," he said. "There's no world we should live in where somebody can fake a sexual video of someone and not be held accountable. We have to find a balance between protecting people, and protecting people's rights."

The future of broader AI regulation remains in question, however. Though Trump championed and signed the Take It Down Act, he and congressional Republicans also pushed to include a 10-year ban on state- and local-level AI regulation in their touted One Big Beautiful Bill.

And even with the president's signature, the future of the law is uncertain, with rights organizations predicting that the legislation may be contested in court on free speech grounds. "There's plenty of non pornographic or sexual material that could be created with your likeness, and right now there's no law against it," added Steinhauer. Regardless of whether Take It Down remains or gets the boot, the issue of AI regulation is far from settled.

Chase DiBenedetto
Social Good Reporter

Chase joined Mashable's Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also captures how these conversations manifest in politics, popular culture, and fandom. Sometimes she's very funny.
