A Case Over AI Voice Clones Poses First-of-Its-Kind Questions

Image: Lovo

July 16, 2025 - By TFL
Key Points

A judge has allowed key claims to proceed in a lawsuit by voice actors, who allege that Lovo used their voices without consent.

IP law offers limited recourse, but the court held that right of publicity, consumer protection, and contract claims can move ahead.

The case raises first-of-its-kind legal questions about consent, identity, and ownership in the age of AI-generated content.


A recently issued decision is drawing attention for the legal questions it raises about artificial intelligence, voice cloning, and individual rights. In a proposed class action lawsuit filed against AI voice generator Lovo, Inc., professional voice actors Paul Lehrman and Linnea Sage allege that Lovo used recordings of their voices to create and sell AI-generated voice clones – without authorization or proper compensation. The suit pushes into new legal terrain, and in doing so, it confronts “a number of difficult questions, some of first impression,” according to a New York federal judge, who recently issued a ruling on Lovo’s motion to dismiss.

The Background in Brief: In May 2024, Lehrman and Sage lodged right of publicity and false advertising claims against Lovo, Inc., a startup in the business of selling “a text-to-speech subscription service that allows its clients to generate voice-over narrations at a fraction of the cost of the traditional model.” They claim that Lovo employees acquired recordings of their voices via Fiverr in 2019 and 2020 – albeit under the guise that they would use the recordings solely for internal research purposes. In reality, Lehrman and Sage claim that Lovo used their “voices and/or identities to create millions of voice-over productions without permission or proper compensation.”

Lovo sought to sidestep the claims, arguing in a motion to dismiss that the plaintiffs’ claims are legally insufficient: their voices were not protectable under trademark or copyright law, and the alleged contracts, which are based on Fiverr messages, are unenforceable.

A Case of First Impression

In his July 10 decision, SDNY Judge Paul Oetken granted in part and denied in part Lovo’s motion to dismiss. Judge Oetken allowed several key claims to proceed, including the plaintiffs’ right of publicity claims, their breach of contract claims based on the Fiverr transactions, and their state consumer protection claim. However, the court dismissed their false endorsement and copyright infringement claims, reasoning that the plaintiffs’ voices had not been used in a way that suggested endorsement, nor were they independently copyrightable.

Venturing into legally unsettled territory, Judge Oetken stated that the case “carries potentially weighty consequences not only for voice actors, but also for the burgeoning AI industry, other holders and users of intellectual property, and [even] ordinary citizens.” A few of the novel questions at play in the case …

> Is a cloned voice protectable under federal IP law? The court found that the actors’ claims mostly fell short. Under trademark law, plaintiffs must show that a mark – here, arguably, a voice – identifies the source of goods or services. But the court said that Lehrman and Sage’s voices, while unique and recognizable, were not used in a source-identifying way by Lovo. Instead, they were the product being sold.

> Can state law step in where federal law stops? While federal IP law may not yet offer a clear remedy, the court emphasized that misappropriation of one’s voice could be actionable under New York’s Civil Rights Law, which protects individuals from the unauthorized commercial use of their name, portrait, picture, or voice. These claims survived, alongside claims for consumer protection and breach of contract.

> Is a Fiverr DM a contract? Yes, at least at this stage. The court found that the Fiverr message exchanges – along with the site’s terms of service – constituted sufficient written agreements with clearly defined limits on use. The judge was unpersuaded by arguments that lack of legal names or formal signatures doomed the contracts.

The Bigger Picture

This is not just about two actors and one AI company. The AI-centric lawsuit provides some early insight into how courts will treat consent, identity, and compensation in an era when software can replicate a person’s voice – and, increasingly, likeness – at scale. Lovo positioned its AI as a cheaper alternative to human voiceover work, but those cost savings may have come at the expense of the actors’ control over their own voices – and at the cost of litigation.

At a moment when generative AI tools are being used to create realistic imagery, videos, audio, and text content, the lawsuit illustrates the apparent lag between technological capability and enforceable legal boundaries. For now, federal IP law offers limited recourse when someone’s identity is cloned by AI. But state-level personality rights may provide a backstop – at least in jurisdictions like New York.

As AI continues to blur the lines between real and manufactured, cases like this one may help define the legal limits of consent, authorship, and personal control in the digital age.

The case is Lehrman et al. v. Lovo, Inc., 1:24-cv-3770 (SDNY).
