New Senate Bill Targets AI Deepfakes, Calls for Content Watermarks

decrypt.co, 12 July 2024, 01:51 UTC

In the latest bid to curb AI-generated deepfakes, a bipartisan group of U.S. senators led by Washington Sen. Maria Cantwell announced on Thursday the introduction of the Content Origin Protection and Integrity from Edited and Deepfaked Media (COPIED) Act.

The COPIED Act calls for a standardized method to watermark AI-generated content so that it can be easily detected. It also requires AI tool providers to allow creators to attach information detailing the origin or “provenance” of their content in a way that cannot be removed.

Senators signing on to the proposed bill include Marsha Blackburn (R-Tenn.) and Martin Heinrich (D-N.M.).

The act “will provide much-needed transparency around AI-generated content,” Sen. Cantwell said in a statement. “The COPIED Act will also put creators, including local journalists, artists, and musicians, back in control of their content with a provenance and watermark process that I think is very much needed.”

Under the proposal, the National Institute of Standards and Technology (NIST) would be responsible for developing the standards used to watermark content and provenance data.

The COPIED Act also bans unauthorized use of content for AI training, calling for creator control and compensation rights. Enforcement of the act would fall to the U.S. Federal Trade Commission (FTC) and state attorneys general.

In November, the FTC claimed authority to enforce laws pertaining to artificial intelligence in the United States, saying that generative AI could “turbocharge” scams and emphasizing the agency’s role in regulating AI to protect consumers.

“Artificial intelligence has given bad actors the ability to create deepfakes of every individual, including those in the creative community, to imitate their likeness without their consent and profit off of counterfeit content,” said Sen. Blackburn. “The COPIED Act takes an important step to better defend common targets like artists and performers against deepfakes and other inauthentic content.”

Generative AI developers have been in a prolonged legal battle with media outlets—including The New York Times and members of the entertainment industry—over copyright infringement claims.

Earlier this month, internet security and services company Cloudflare launched an “easy button” to block AI bots from scraping websites, aiming to protect content creators from unauthorized data harvesting. Cloudflare said it found that AI bots accessed 39% of the top sites using its network, while only 2.98% of those sites took steps to block them.

The Screen Actors Guild–American Federation of Television and Radio Artists (SAG-AFTRA) and the Recording Industry Association of America (RIAA) were among those praising the introduction of the COPIED Act.

“Senator Cantwell’s legislation would ensure that the tools necessary to make the use of AI technology transparent and traceable to the point of origin will make it possible for victims of the misuse of the technology to identify malicious parties and go after them,” Duncan Crabtree-Ireland, SAG-AFTRA’s national executive director and chief negotiator, said in a statement. “We need a fully transparent and accountable supply chain for generative artificial intelligence and the content it creates in order to protect everyone’s basic right to control the use of their face, voice, and persona.”

Last year, SAG-AFTRA and the Writers Guild of America (WGA) staged months-long strikes after contract negotiations, including over how AI would be used in Hollywood, broke down.

“Protecting the life’s work and legacy of artists has never been more important as AI platforms copy and use recordings scraped off the internet at industrial scale and AI-generated deepfakes keep multiplying at a rapid pace,” RIAA Chairman and CEO Mitch Glazier said in a separate release. “RIAA strongly supports provenance requirements as a fundamental building block for accountability and enforcement of creators’ rights.”

In April, British actor and musician FKA Twigs, whose real name is Tahliah Debrett Barnett, testified before the U.S. Senate Judiciary Committee about the importance of artists controlling their digital likenesses and the need for legislation to prevent misuse without consent.

“We dedicate a lifetime of hard work and sacrifice in the pursuit of excellence—not only in the expectation of achieving commercial success and critical acclaim, but also in the hope of creating a body of work and reputation that is our legacy,” FKA Twigs told the committee. “I’m here because my music, my dancing, my acting, the way my body moves in front of the camera, and the way that my voice resonates for a microphone is not by chance. They are essential reflections of who I am.”

Representatives for Sen. Cantwell did not immediately respond to a request for comment from Decrypt.
