Witnesses Tell Senate IP Subcommittee They Must Get NO FAKES Act Right

“It will take very careful drafting to accomplish the bill’s goals without inadvertently chilling or even prohibiting constitutional uses of technology to enhance storytelling.” – Ben Sheffner, Motion Picture Association, Inc.

The U.S. Senate Judiciary Committee’s Subcommittee on Intellectual Property met today to hear from six witnesses about a recently proposed bill to curb unauthorized uses of an individual’s voice and likeness via artificial intelligence (AI).

The “Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023” (NO FAKES Act) was introduced in October 2023 by Senator and Chair of the IP Subcommittee Chris Coons (D-DE) and Senators Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC). The goal of the bill is to “protect the voice and visual likenesses of individuals from unfair use through generative artificial intelligence (GAI).”

In his opening remarks, Coons said that, since the text of the discussion draft was released last year, they have heard from many stakeholders and held dozens of meetings to obtain feedback. The feedback has centered on five key areas to address in the bill:

  • Whether there should be a notice and takedown structure for removing illicit AI content;
  • whether the right balance has been struck in the text with the First Amendment;
  • whether the bill’s 70-year post-mortem term should be adjusted or narrowed;
  • whether the Act should have preemptive effect over related state laws; and
  • whether it should create a process for individuals with limited resources and minimal damages to enforce their rights.

“I think the NO FAKES Act is unique because it touches everybody,” Tillis, who is the Subcommittee’s Ranking Member, said. Unlike other IP bills that apply just to patent holders or creators, the bill would confer a right to control one’s visual likeness on everyone.

In perhaps a nod to problems the Subcommittee has encountered with opposition to bills like PERA and PREVAIL in the past, Tillis also commented that he hopes everyone comes to the table to seek compromise on the bill, considering its importance. “The only thing that really makes me mad is when I see somebody trying to, through guerrilla warfare, undermine the good faith efforts of this committee and my colleague,” Tillis said. “If you’re at the table you can have an influence; if you’re not at the table you’re gonna be on the table. And if you’re in the category of ‘if it ain’t broke don’t fix it,’ you’re not up with modern times.”

The six witnesses who testified today included an artist named Tahliah Debrett Barnett (“FKA twigs”). A singer, songwriter, producer, dancer, and actor, Twigs explained that she is both using AI to enhance her career and also being exploited by it.

On the one hand, she said she has created an AI version of herself that she can use to speak in multiple languages in her own voice, which helps her to reach and connect with fans more effectively, and said that AI also allows artists to “spend more time making art.” Simple one-liner press comments, for instance, can be delivered using AI. However, she has also found songs online that appeared to be made by her but that she didn’t actually create or perform:

“It makes me feel vulnerable because as an artist I’m very precise. I’m very proud of my work and proud of the fact that my fans trust me because I put deep meaning into it. If legislation isn’t put in place to protect artists, not only will we let artists down who really care about what we do, but it also would mean that fans wouldn’t be able to trust people they’ve spent so many years investing in.”

The artist went on to say that it is hard to find the language to explain why AI needs to be regulated in this area because “it feels so painfully obvious to me” that artists need to have a right to control their own likeness. Tillis commented to laughter that a lot of issues Congress debates seem painfully obvious to everyone, so she’s not alone.

However, other witnesses, such as Ben Sheffner, Senior Vice President and Associate General Counsel, Law and Policy at the Motion Picture Association, Inc., said Congress needs to tread lightly lest it infringe on First Amendment rights. While Sheffner said the NO FAKES Act is a thoughtful contribution to the debate about what guardrails should be in place, “legislating in this area necessarily involves doing something the First Amendment sharply limits—regulating the content of speech.” Thus, “it will take very careful drafting to accomplish the bill’s goals without inadvertently chilling or even prohibiting constitutional uses of technology to enhance storytelling,” Sheffner said.

As an example, Sheffner referenced the 1994 film, Forrest Gump, in which the digital technology of the time was used to create replicas of figures such as John F. Kennedy and Richard Nixon. Under the draft text, representations of such figures could require consent from their heirs or corporate successors, which Sheffner argued would violate the First Amendment.

Sheffner said five key points need to be kept in mind when refining the bill:

  • Getting exemptions right is crucial or it will chill creativity;
  • the bill should preempt state laws that regulate the use of digital replicas in expressive works;
  • the scope of the right should focus on the replacement of performances by living performers only;
  • the definition of digital replica must be focused on highly realistic versions of individuals, not cartoon-like versions; and
  • Congress must stop and consider whether what it’s seeking to protect is already covered by other laws, such as defamation.

Lisa P. Ramsey, Professor of Law at the University of San Diego School of Law, also warned the Subcommittee about the First Amendment implications of the bill. Ramsey said the current draft of the NO FAKES Act imposes restrictions on the content of speech that don’t pass First Amendment muster. “When the Act applies to the use of digital replicas to impersonate people in fraudulent or misleading commercial speech it is consistent with the First Amendment,” Ramsey said. “The problem is that the current version of the Act also regulates non-misleading speech. It must be narrowly tailored to directly and materially further its goals and not harm speech protected by the First Amendment more than necessary.”

As drafted, Ramsey said the bill is not consistent with First Amendment principles because it’s “overbroad and vague,” but that a revised version could withstand scrutiny. She suggested three key improvements:

  1. While she said the bill does a better job than the No AI Fraud Act in setting forth specific exceptions for the First Amendment, it’s important to include a strict rule for online service providers requiring specific and actual knowledge of the use of unauthorized replicas for any liability. Online service providers should also implement a notice and takedown system to make it easier to remove unauthorized deepfakes, with takedowns challengeable via counter-notification.
  2. Congress should create separate causes of action that target different harms. The use of deepfakes to impersonate individuals in a deceptive manner, uses of sexually explicit deepfakes, and uses that substitute for an individual’s performance that they typically would have created in real life should all have different requirements and distinct speech-protective exceptions.
  3. Each provision should adequately protect speech interests and preempt state laws on right of publicity and digital replica rights. Individuals should be able to consent to each licensing use of digital replicas, as allowing others to control an individual’s image through broad licensing agreements would work at cross-purposes with the Act.

On the topic of service provider liability, Graham Davies, President and Chief Executive Officer of the Digital Media Association, said liability should be focused on the creator and those first releasing the content. “Streaming services don’t have any way to know the complex chain of rights,” Davies said. He also said that new legislation should be developed from the existing right of publicity laws, rather than IP law, since there’s an existing body of case law.

Duncan Crabtree-Ireland, National Executive Director and Chief Negotiator of the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), said he likes the fact that the bill provides for broader protection than just commercial uses, but said the current language doesn’t sufficiently limit the term of transfer or licensing of an individual’s likeness. This could be a problem for a young artist, for instance. “I’d adopt a durational limitation on transfers and licenses during lifetime,” Crabtree-Ireland said. “It’s essential to make sure someone doesn’t improvidently grant a transfer of rights early in life that turns out to be unfair to them.”

He added that, while SAG-AFTRA members are strong advocates for the First Amendment, the Supreme Court has made it clear that the First Amendment does not require that the speech of the press be privileged over the protection of the individual, and that balancing tests should determine which right prevails.

Robert Kyncl, Chief Executive Officer of Warner Music Group, agreed, explaining that First Amendment concerns over responsible AI are misguided. “AI can make you say things you didn’t say or don’t believe; that’s not freedom of speech,” Kyncl said.

He added that attribution through watermarking in order to determine provenance will be crucial and urged the Subcommittee to take the time to get it right. “We are in a unique moment of time where we can still act and we can get it right before it gets out of hand—the genie’s not yet out of the bottle, but it will be soon.”

Twigs, who is represented by Warner Music Group, reiterated the importance of giving artists back control:

“Ultimately, what it boils down to is my spirit, my artist[ry] and my brand is my brand and I’ve spent years developing it and it’s mine—it doesn’t belong to anybody else to be used in a commercial sense or a cultural sense, or even just for a laugh. I am me, I am a human being, and we have to protect that.”



Warning & Disclaimer: The pages, articles and comments on IPWatchdog.com do not constitute legal advice, nor do they create any attorney-client relationship. The articles published express the personal opinion and views of the author as of the time of publication and should not be attributed to the author’s employer, clients or the sponsors of IPWatchdog.com.

Join the Discussion


  • Anon
    May 2, 2024 01:24 pm

    Apparently, “That Patent Guy” was not exposed to First Amendment law (or for that matter, Copyright law).

  • That Patent Guy
    May 1, 2024 06:02 pm

    I believe it was William Gibson, a pioneer writer in a sub-genre of science fiction called “cyberpunk” in its day, who quipped in the late 1990s or early 2000s that we have moved beyond a “record” culture and we are now a “remix” culture. Copyright law is solidly stuck in a “record” culture, and hasn’t evolved to a world of sampling prior art and creating new art from a myriad of puzzle pieces. My personal opinion is that new law should censure actual counterfeits such as deepfakes, where a likeness of a real human person, in video or audio or both, is artificially generated and passed off as an actual act of the actual person. In hindsight, consider when comedienne Tina Fey portrayed Sarah Palin stating that she could see Russia from her house in Alaska. Sarah Palin never said that, but the political damage in portraying her as ignorant was done, and even today a non-trivial number of people believe that Palin actually spoke those words. That was a legal, political hard-ball hit job, and I believe that emerging AI law needs to hew to that model, so that if a creator includes a disclaimer attached to a work that states that it is *not* the *real* Frank Sinatra or Aubrey “Drake” Graham, then a work in the style of the artist and clearly indicated as such should remain within Fair Use, especially if the imitation is distributed or exhibited at no cost or profit. But even if the 70-year rule (it was 50 years and then Sonny Bono skied into a tree) is applied, then it’ll only be a few decades and the actors and musicians of the near future will have to compete with AI-generated performances of powerhouse celebrities of the 1950s-1970s. It might be interesting and certainly entertaining to resurrect Buster Crabbe, Tom Mix, John Banner, Judy Garland, Veronica Lake, Jayne Mansfield, and many more as they give the artists of a few years hence a run for the money.

  • Anon
    May 1, 2024 09:13 am

    Twigs wants an expansion of rights that currently do not exist in the US laws.

    As a reminder: US Copyright has limits. It was correctly pointed out that one source of those limits is the US First Amendment.

    As another reminder: Fair Use reflects limits of rights in the first instance. Those, like twigs, who want more than what is theirs (by right), should first come to understand why what they merely want is not what they currently have.

  • Chris Nixon
    May 1, 2024 06:28 am

    The solution seems pretty straightforward to me. The bill needs wording about intent, and what qualifications determine intent. For example I think you can clear up a lot of theoretical problems by saying it’s fine to create a digital likeness so long as it’s not being used with the INTENT of deception to make people believe it’s the real person.

