Snapshot: Main Sequence v. Dudesy

The newest artificial intelligence (“AI”)-centric lawsuit to keep an eye on is one that was filed this week by the estate of George Carlin. After Dudesy, a media company in the business of creating AI-generated works, released an hour-long special featuring an AI-generated imitation of George Carlin’s voice on the Dudesy podcast’s YouTube channel on January 9, the late comedian’s estate lodged right of publicity and copyright infringement claims in a federal court in California. (The special, entitled “George Carlin: I’m Glad I’m Dead,” is still live on Dudesy’s YouTube channel.) According to the complaint, dated January 25, more than 15 years after Carlin’s death, Dudesy and its founders, comedian Will Sasso and writer Chad Kultgen, “took it upon themselves to ‘resurrect’ Carlin with the aid of AI.”

“Using Carlin’s original copyrighted works,” Dudesy LLC, Sasso, and Kultgen (collectively, “Dudesy” and/or “defendants”) “created a script for a fake George Carlin comedy special and generated a sound-alike of George Carlin to ‘perform’ the generated script,” according to Main Sequence, Ltd., Jerold Hamza as executor for the Estate of George Carlin, and Jerold Hamza in his individual capacity (collectively, “Carlin’s estate” and/or the “plaintiffs”). The plaintiffs assert that “none of the defendants had permission to use Carlin’s likeness for the AI-generated ‘George Carlin Special,’ nor did they have a license to use any of the late comedian’s copyrighted materials.”

Carlin’s estate, which is represented by Boies Schiller, further asserts that “the defendants’ AI-generated ‘George Carlin Special’ is not a creative work.” Instead, it is “a piece of computer-generated click-bait which detracts from the value of Carlin’s comedic works and harms his reputation, [and] it is a casual theft of a great American artist’s work.”

Against that background, the plaintiffs set out claims of violation of rights of publicity under California common law and deprivation of rights of publicity under Cal. Civ. Code § 3344.1, taking issue with Dudesy’s use of Carlin’s “name, reputation, and likeness” – namely, its use of “generated images of Carlin, Carlin’s voice, and images designed to evoke Carlin’s presence on a stage.”

The plaintiffs also set out a claim of federal copyright infringement, arguing that the defendants have “unlawfully used [the] plaintiffs’ copyrighted works for building and training a dataset for purposes of generating an output intended to mimic the plaintiffs’ copyrighted work (i.e., Carlin’s stand-up comedy).”

With the foregoing in mind, the plaintiffs are seeking monetary damages, as well as preliminary and permanent injunctive relief to bar Dudesy and co. “from directly committing, aiding, encouraging, enabling, inducing, causing, materially contributing to, or otherwise facilitating use of George Carlin’s copyrighted works to generate Dudesy Specials and any other contents created or disseminated by Dudesy, LLC relating to those Dudesy Specials.” Additionally, they want the court to order Dudesy to “immediately remove, take down, and destroy any video or audio copies (including partial copies) of the ‘George Carlin Special,’ wherever they may be located.”

The lawsuit comes as deepfakes and other replications of individuals’ likenesses continue to sound alarms in various industries, including (but not limited to) film/TV and music. A growing number of lawmakers are proposing legislation to address the threat of generative AI on this front, with Tennessee Governor Bill Lee, for example, announcing the Ensuring Likeness Voice and Image Security (“ELVIS”) Act earlier this month. The bill would update Tennessee’s Protection of Personal Rights law to protect the voices of songwriters, performers, and music industry professionals from the misuse of AI. Before that, Senators Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC) introduced the Nurture Originals, Foster Art, and Keep Entertainment Safe (“NO FAKES”) Act back in October 2023 to address “the use of non-consensual digital replications in … audiovisual works or sound recordings.”

Still, it is worth noting that the introduction of the proposed NO FAKES Act and other proposed legislation comes amid a larger, AI-focused push for a federal right of publicity among lawmakers and individual organizations alike. While about half of U.S. states have recognized the right of publicity, a lack of uniformity among such state laws – and the resulting unpredictability – has prompted renewed calls for a federal statute in light of the rapid rise of deepfakes.

The case is Main Sequence, et al. v. Dudesy LLC, et al., 2:24-cv-00711 (C.D. Cal.).