
Nonconsensual AI Porn Will Make Victims Out of All Of Us: Can The Law Keep Up?

This article was copublished with The 19th, a nonprofit newsroom covering gender, politics, and policy. Sign up for The 19th’s newsletter here.

More than two dozen students at Westfield High School in New Jersey were horrified last year to learn that naked images of them were circulating among their peers. According to the school, some students had used artificial intelligence (AI) to create pornographic images of others from original photos. And they’re not the only teenage girls being victimized by fake nude photos: Students in Washington State and Canada have reported facing similar situations as the ability to realistically alter photos becomes more broadly accessible through websites and apps.

The growing alarm around deepfakes—AI-generated images or videos—in general was amplified even further in January, as one involving the superstar Taylor Swift spread quickly through social media.

Carrie Goldberg, a lawyer who has been representing victims of nonconsensual porn—commonly referred to as revenge porn—for more than a decade, said she only started hearing from victims of computer-generated images more recently.

“My firm has been seeing victims of deepfakes for probably about five years now, and it’s mostly been celebrities,” Goldberg said. “Now, it’s becoming children doing it to children to be mean. It’s probably really underreported because victims might not know that there’s legal recourse, and it’s not entirely clear in all cases whether there is.”

Governing bodies are trying to catch up. In the past year or so, 10 states have passed legislation to specifically criminalize the creation or dissemination of deepfakes. These states—including California, Florida, Georgia, Hawaii, Illinois, Minnesota, New York, South Dakota, Texas, and Virginia—outlined penalties ranging from fines to jail time. Indiana is likely to soon join the growing list by expanding its current law on nonconsensual porn.

Indiana Rep. Sharon Negele, a Republican, authored the proposed expansion. The existing law defines “revenge porn” as disclosing an intimate image, such as one depicting sexual intercourse, uncovered genitals, buttocks or a woman’s breast, without the consent of the individual depicted. Negele’s proposed bill passed through both chambers and is now awaiting the governor’s signature.

Negele said she was motivated to update Indiana’s criminal code when she heard the story of a high school teacher who discovered that some of her students had disseminated deepfake images of her. It was “incredibly destructive” to the teacher’s personal life, and Negele was surprised to see that the perpetrators could not be prosecuted under current law.

“It started with my education of understanding the technology that is now available and reading about incident after incident of people’s faces being attached to a made-up body that looks incredibly real and realistic,” Negele said. “It’s just distressing. Being a mom and a grandmother and thinking about what could happen to my family and myself—it’s shocking. We’ve got to get ahead of this kind of stuff.”

Goldberg, whose law firm specializes in sex crimes, said she anticipates more states will continue expanding their existing legislation to include AI language.

“Ten years ago, only three states had revenge porn or image-based sexual abuse laws,” Goldberg said. “Now, 48 states have outlawed revenge porn, and it has really created a tremendous reduction in revenge porn—not surprisingly—just as we advocates had said it would. The whole rise of deepfakes has filled in the gaps as being a new way to sexually humiliate somebody.”

In 2023, more than 143,000 new AI-generated videos were posted online, according to The Associated Press. That’s a huge jump from 2019, when “nudify” websites and applications were less commonplace, yet there were still nearly 15,000 of these fake videos online, according to a report from Deeptrace Labs, a visual threat intelligence company. Even back then, those videos—96 percent of which featured nonconsensual pornography of women—had garnered over 100 million views.

Goldberg said policymakers and the public alike seem to be more motivated to ban AI-generated nude images specifically because virtually anyone can be a victim. There’s more empathy.

“With revenge porn, in the first wave of discussions, everyone was blaming the victim and making them seem like they were some sort of pervert for taking the image or stupid for sharing it with another person,” Goldberg said. “With deepfakes, you can’t really blame the victim because the only thing they did was have a body.”

Amanda Manyame, a South Africa-based digital rights advisor for Equality Now, an international human rights organization focused on helping women and girls, said that there are virtually no protections for victims of deepfakes in the United States. Manyame studies policies and laws around the world, analyzes what’s working and provides legal advice around digital rights, particularly on tech-facilitated sexual exploitation and abuse.

“The biggest gap is that the U.S. doesn’t have federal law,” Manyame said. “The challenge is that the issue is governed state by state, and naturally, there’s no uniformity or coordination when it comes to protections.”

There is, however, currently a push on Capitol Hill: A bipartisan group of senators in January introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, also known as the DEFIANCE Act. The proposed legislation aims to stop the proliferation of nonconsensual, sexually explicit content.

“Nobody—neither celebrities nor ordinary Americans—should ever have to find themselves featured in AI pornography,” Republican Sen. Josh Hawley, a co-sponsor of the bill, said in a statement. “Innocent people have a right to defend their reputations and hold perpetrators accountable in court.” Rep. Alexandria Ocasio-Cortez has introduced a partner bill in the House.

According to new polling from Data for Progress, 85 percent of likely voters across the political spectrum said they support the proposed DEFIANCE Act—with 72 percent of women in strong support compared to 62 percent of men.

But younger men are more likely to oppose the DEFIANCE Act, with about one in five men under 45 (22 percent) saying they strongly or somewhat oppose legislation allowing subjects of explicit nonconsensual deepfakes to sue the creator.

Danielle Deiseroth, executive director of Data for Progress, said this issue showed one of the “more sharp contrasts” between young men and women that she’s seen in a while.

“We can confidently say that women and men under 45 have diverging opinions on this policy,” Deiseroth said. “This is an issue that disproportionately impacts women, especially young women, who are more likely to be victims of revenge porn. And I think that’s really the root cause here.”

Goldberg said that creating policies to criminalize bad actors is a good start but is ultimately insufficient. A good next step, she said, would be to take legal action targeting the online distributors, like the App Store and Google Play, that are providing products primarily used for criminal activities. Social media platforms and instant messaging apps, where these explicit images are distributed, should also be held accountable, Goldberg added.

The founders of #MyImageMyChoice, a grassroots organization working to help victims of intimate image abuse, agreed that more should be done by private companies involved in the creation and distribution of these images.

The founders—Sophie Compton, Reuben Hamlyn and Elizabeth Woodward—pointed out that search engines like Google drive most of the total web traffic to deepfake porn sites, while credit card companies process their payments. Internet service providers let people access them, while major services like Amazon, Cloudflare, and Microsoft’s GitHub host them. And social media sites like X allow the content to circulate at scale. Google changed its policy in 2015, allowing victims to submit requests to remove individual pieces of content from search results, and has since expanded the policy to cover deepfake abuse. However, the company does not systematically delist image-based sexual violence and deepfake abuse sites.

“Tech companies have the power to block, de-index or refuse service to these sites—sites whose entire existence is built on violating consent and profiting from trauma,” Compton, Hamlyn and Woodward said in a statement to The 19th. “But they have chosen not to.”

Goldberg pointed to the speed at which the Taylor Swift deepfakes spread. One image shared on X, formerly known as Twitter, was viewed 47 million times before the account that posted it was suspended. Images continued to spread despite efforts from the social media companies to remove them.

“The violent, misogynistic pictures of Taylor Swift, bloody and naked at a Kansas City Chiefs football game, is emblematic of the problem,” Goldberg said. “The extent of that distribution, including on really mainstream sites, sends a message to everybody that it’s okay to create this content. To me, that was a really pivotal and quite frightening moment.”

Given the high-profile nature of the victim, the incident sparked pronounced and widespread outrage from Swift’s fans and brought public attention to the issue. Goldberg said she checked to see whether any of the online distributors had removed products from their online stores that make it easier and cheaper to create sexually explicit deepfakes—and she was relieved to see they had.

As the country’s policymakers and courts continue to try to respond to the quickly developing and increasingly accessible technology, Goldberg said it’s important that lawmakers continue deferring to experts and those who work directly with victims, such as lawyers, social workers and advocates. Otherwise, she added, lawmakers regulating abstract ideas or rapidly advancing technologies can create a “recipe for disaster.”

Manyame also emphasized the importance of speaking directly to survivors when making policy decisions, but added that lawmakers need to think more holistically about the problem and not become too bogged down by the specific technology—at the risk of always being behind. For example, Manyame said the general public is only now beginning to understand the risks posed by AI and deepfakes—a risk she helped write a report about back in 2021. Looking ahead, Manyame is already thinking about the Metaverse—a virtual reality space—where users are starting to reckon with instances of rape, sexual harassment and abuse.

“A lot of the laws around image-based sexual abuse are a little bit dated because they speak about revenge porn specifically,” Manyame said. “Revenge porn has historically been more of a domestic violence issue, in that it is an intimate partner sharing a sexually exploitative image of their former or existing partner. That’s not always the case with deepfakes, so these laws might not provide enough protections.”

In addition, Manyame argued that many of these policies fail to broaden the definition of “intimate image” to consider diverse cultural or religious backgrounds. For some Muslim women, for instance, it might be just as violating and humiliating to create and disseminate images of them with their heads uncovered, without a hijab.

When it comes to solutions, Manyame pointed to actions that can be taken by app creators, platform regulators and lawmakers.

At the design phase, more safety measures can be embedded to limit harm. For example, Manyame said there are apps that can take photos of women and automatically remove their clothing, while that same function doesn’t work on photos of men. There are ways on the back end of these apps to make it harder to remove clothes from anyone, regardless of gender.

Once the nefarious deepfakes are already created and posted, however, Manyame said the social media and messaging platforms should have better mechanisms in place to remove the content after victims report it. Many times, individual victims are ignored. Manyame said she’s noticed these large social media companies are more likely to remove these deepfakes in countries, such as Australia, that have third-party regulators to advocate on behalf of victims.

“There needs to be monitoring and enforcement mechanisms included in any solution,” Manyame said. “One of the things that we hear from a lot of survivors is they just want their image to be taken down. It’s not even about going through a legal process. They just want that content gone.”

Manyame said it’s not too big of an ask for many tech companies and government regulators because many already respond quickly to remove inappropriate photos involving children. It’s just a matter of extending those kinds of protections to women, she added.

“My concern is that there’s been a rush to implement AI laws and policies without considering what some of the root causes of these harms are,” Manyame said. “It’s a layered problem, and there’s many other layers that need to be tackled.”


Credits

  • Mariel Padilla for The 19th

Illustration

  • Rena Li

Editing

  • Flora Peir for The 19th


Photo by Steve Johnson on Unsplash
