New Bill Aims to Protect Artists and Journalists from AI Content Misuse

Protecting Content Origin and Combating Deepfakes: The COPIED Act

A bipartisan group of senators has introduced a new bill aimed at safeguarding the rights of artists, songwriters, and journalists as AI advances. The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) seeks to prevent content from being used to train AI models or to generate AI content without the creators' consent. The bill also aims to address the growing concern over harmful deepfakes.

Authored by Senate Commerce Committee Chair Maria Cantwell, Senate AI Working Group member Martin Heinrich, and Commerce Committee member Marsha Blackburn, the COPIED Act puts forward significant measures to protect content owners. One key provision requires companies developing AI tools to allow users to attach content provenance information to their work within two years. This machine-readable information documents the origin of digital content, such as photos and news articles. Once content has provenance information attached, it cannot be used to train AI models or generate AI content.
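For a rough sense of what "machine-readable provenance information" could look like in practice, here is a minimal, purely illustrative sketch in Python. The field names, structure, and helper function are assumptions made for this example; the actual format would be defined by the standards discussed below, not by the bill itself.

```python
import hashlib
import json
from datetime import datetime, timezone

def build_provenance_record(content_bytes: bytes, creator: str, license_terms: str) -> dict:
    """Build a hypothetical machine-readable provenance record for a piece of content.

    The field names here are illustrative assumptions, not a format defined by the
    COPIED Act or by NIST.
    """
    return {
        "content_sha256": hashlib.sha256(content_bytes).hexdigest(),  # fingerprint of the original file
        "creator": creator,                                           # who produced the work
        "created_at": datetime.now(timezone.utc).isoformat(),         # when the record was generated
        "license_terms": license_terms,                               # e.g. "no AI training without consent"
        "ai_generated": False,                                        # whether the content itself is synthetic
    }

if __name__ == "__main__":
    sample = b"example article text or image bytes"
    record = build_provenance_record(
        sample,
        creator="Example Newsroom",
        license_terms="no AI training without consent",
    )
    print(json.dumps(record, indent=2))
```

In a real system, a record like this would be embedded in or cryptographically bound to the file itself so that downstream AI developers could detect it and honor the stated terms.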

By enabling content owners to protect their work and set the terms of use, including compensation, the COPIED Act empowers creators. It also grants them the right to take legal action against platforms that use their content without permission or tamper with content provenance information. This legislation aims to restore control to local journalists, artists, musicians, and other creators.

To ensure effective implementation, the COPIED Act directs the National Institute of Standards and Technology (NIST) to establish guidelines and standards for content provenance information, watermarking, and synthetic content detection. These standards would play a crucial role in determining whether content has been generated or altered by AI and in identifying the origin of AI-generated content.

Senator Cantwell emphasizes the importance of transparency around AI-generated content, and she believes the COPIED Act will provide much-needed clarity while also protecting creators. The bill has gained support from several artists' and publishing groups, including SAG-AFTRA, the National Music Publishers' Association, The Seattle Times, the Songwriters Guild of America, and the Artist Rights Alliance.

The introduction of the COPIED Act reflects a broader push by lawmakers to regulate AI. Senator Ted Cruz recently introduced the Take It Down Act, which would hold social media companies accountable for removing and policing nonconsensual deepfake pornography. That bill responds to AI-generated pornographic images of celebrities circulating on platforms like X and Instagram.

In addition to these legislative efforts, Senate Majority Leader Chuck Schumer proposed a comprehensive roadmap for addressing AI. The roadmap includes increased funding for AI innovation, measures to combat deepfakes in elections, and leveraging AI for national security.

The growing attention to AI regulation is not limited to the federal level. State legislatures have been introducing an average of 50 AI-related bills per week, according to Axios. As of February, over 40 states had introduced a total of 407 AI-related bills, marking a significant increase from the previous year.

Recognizing the potential risks associated with AI, President Joe Biden issued an executive order last October to establish standards for AI safety and security, including a requirement that developers share safety test results and other critical information with the government before deploying their AI systems. Former President Donald Trump, however, has said he would repeal the executive order if re-elected.

In conclusion, the COPIED Act represents a crucial step toward protecting content origin and combating the rise of harmful deepfakes. By giving creators control over their work and establishing guidelines for content provenance information, the legislation aims to restore transparency and accountability in the era of AI-generated content. With Congress and state legislatures both moving to regulate AI, safeguards for its responsible and ethical use are steadily taking shape.