Lawmakers on Capitol Hill are scrambling to address the boom in deepfake AI pornographic images, which have targeted everyone from celebrities to high school students.
The surge in the creation and dissemination of these AI-generated images has prompted urgent legislative action. Digitally manipulated pornography has not only invaded the privacy of public figures but has also been used to harass and humiliate ordinary individuals, including minors. The proliferation of these images presents severe ethical, legal, and social challenges that require immediate attention.
The Take It Down Act: A New Legislative Proposal
Senator Ted Cruz (R-Texas) is at the forefront of this legislative effort. As the primary sponsor of the “Take It Down Act,” Cruz aims to impose strict requirements on social media companies to ensure the rapid removal of deepfake pornography.
Key Provisions of the Bill
Accountability for Social Media Platforms
– The bill mandates that social media platforms develop a robust process for removing deepfake pornographic images within 48 hours of receiving a valid request from the victim. This provision aims to provide swift relief to individuals whose images have been maliciously altered and shared.
– Platforms must also make reasonable efforts to remove all other copies of the offending images, including those shared in private groups. This comprehensive approach seeks to prevent the further spread of harmful content across the internet.
Criminalization of Deepfake Porn
– Publishing or threatening to publish deepfake pornographic images would be criminalized under this legislation. The provision aims to deter individuals from creating and distributing such content by imposing significant legal consequences.
Enforcement by the Federal Trade Commission
The Federal Trade Commission (FTC) would be responsible for enforcing these new regulations. The FTC, which already oversees consumer protection rules, would ensure that social media platforms comply with the requirements to remove deepfake content swiftly and thoroughly.
Bipartisan Support and Victims’ Advocacy
The introduction of the Take It Down Act is supported by a bipartisan group of senators, reflecting a broad consensus on the urgency of addressing deepfake pornography. These senators will be joined at the Capitol by victims of deepfake porn, including high school students, who will share their personal experiences and highlight the devastating impact of these malicious images.
Impact on Celebrities and Public Figures
The rise of non-consensual AI-generated images has significantly affected high-profile individuals. Celebrities like Taylor Swift and politicians like Rep. Alexandria Ocasio-Cortez (D-N.Y.) have been targeted, underscoring the widespread nature of this digital abuse.
“By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime,” Cruz stated.
Competing Legislative Efforts
Despite widespread recognition of the problem, there is no unified approach to tackling deepfake AI pornography. Two competing bills are currently under consideration in the Senate.
Sen. Dick Durbin’s Bill
Senator Dick Durbin (D-Ill.) introduced a bipartisan bill earlier this year that allows victims to sue individuals who create, possess, or distribute non-consensual deepfake images. This approach focuses on providing a legal recourse for victims directly against those responsible for creating and spreading the harmful content.
Key Differences
Under Cruz’s bill, the onus is on social media companies to moderate and remove deepfake content, treating it like other forms of extremely offensive online material. Durbin’s bill, however, faced opposition for being “overly broad in scope” and potentially stifling technological innovation, according to Sen. Cynthia Lummis (R-Wyo.).
Lummis, an original co-sponsor of Cruz’s bill, is joined by other senators from both parties, including Shelley Moore Capito (R-W.Va.), Amy Klobuchar (D-Minn.), Richard Blumenthal (D-Conn.), and Jacky Rosen (D-Nev.).
Moving Forward with AI Legislation
The introduction of the Take It Down Act coincides with a broader push by Senate Majority Leader Chuck Schumer (D-N.Y.) to address AI-related issues. A recently released task force “roadmap” emphasized the need for legislation to tackle the non-consensual distribution of intimate images and other harmful deepfakes.
As deepfake technology continues to evolve, the legislative response will need to be swift and comprehensive to protect individuals from the malicious misuse of AI-generated content. The bipartisan nature of the Take It Down Act and the support from senators across both parties mark a critical step toward ensuring that victims of deepfake pornography have a robust legal framework for protection and redress.
The coming months will be crucial as lawmakers debate and refine these legislative proposals to create an effective and balanced approach to combating the harmful effects of deepfake technology.