
New Anti-Deepfake Legislation Passes Senate Vote, Protecting Victims of Nonconsensual Digital Forgeries

Protecting Against Deepfake Pornography: The Defiance Act Takes a Stand

The fight against deepfake pornography took a significant step forward as the Disrupt Explicit Forged Images and Non-Consensual Edits (Defiance) Act passed a Senate vote with unanimous consent. The legislation, one of the first in a potential wave of AI-focused regulations, brings federal law a step closer to providing comprehensive protections against nonconsensual deepfake pornography.

Introduced by Senate Judiciary Chair Dick Durbin and Republican senator Lindsey Graham, the bipartisan Defiance Act grants victims the right to sue individuals who “knowingly produce, distribute, or receive” nonconsensual sexually explicit digital forgeries. However, it is Democratic representative Alexandria Ocasio-Cortez, who co-led the bill, who has emerged as the figurehead of the legislation.

Ocasio-Cortez, who has herself been the target of synthetic forgeries, underscored the importance of the Defiance Act in a statement following the Senate vote. She noted that over 90 percent of all deepfake videos are nonconsensual sexually explicit material, and that women are the targets in 9 out of 10 cases. The Defiance Act, according to Ocasio-Cortez, would ensure federal protections for survivors of nonconsensual deepfake pornography for the first time.

The urgency of addressing deepfake pornography is evident, as even high-profile figures like Vice President Kamala Harris have become victims of manipulated videos. The widespread circulation on TikTok of a deepfake video showing Harris delivering a speech she never gave, despite being debunked multiple times, underscores the need for action. Furthermore, reports from UK watchdogs reveal that AI-created digital forgeries have contributed to the proliferation of online child sexual abuse material.

While the Defiance Act provides a civil path to remediation for those targeted in deepfakes, many victims are calling for criminal penalties for the creators and distributors of nonconsensual synthetic forgeries. The legislative path to criminalizing deepfakes, however, may follow the same trajectory as that of real nonconsensual pornography, or revenge porn. Currently, criminal liability for nonconsensual pornography falls under state law, while the federal government offers a civil remedy through the 2022 reauthorization of the Violence Against Women Act.

Recognizing the gravity of the situation, Senator Ted Cruz introduced the Take It Down Act, which aims to criminalize the publication of both synthetic and real nonconsensual intimate imagery. The legislation also outlines penalties for tech companies that fail to remove such content within 48 hours. The White House has likewise called out tech companies for their role in the proliferation of deepfakes.

Although the Defiance Act has passed the Senate, it must still clear the House, where it awaits a vote at a later date. Its passage would mark a significant milestone in the battle against nonconsensual deepfake pornography, giving victims the legal recourse they desperately need.

In conclusion, the Defiance Act offers hope for survivors of nonconsensual deepfake pornography and sets a precedent for future AI-focused regulations. By granting victims the right to sue those involved in the creation and distribution of nonconsensual synthetic forgeries, the legislation aims to deter the production and spread of deepfakes. Continued efforts to combat deepfake pornography, through both civil and criminal channels, demonstrate a commitment to protecting individuals from the harms of this rapidly evolving technology.