Last updated on October 28, 2025
No. In states like Tennessee, California, and New York, using a cloned or AI-generated celebrity voice in a commercial ad without consent is prohibited under newly enacted right-of-publicity and digital replica laws. These laws treat voice as a protected aspect of identity, and violations can lead to civil liability and statutory damages.
The Sound of Fame Meets the Law
Picture this: a slick commercial featuring the unmistakable voice of a Hollywood icon—except the star never stepped into a recording booth. Instead, an AI model did the talking. Clever? Yes. Legal? Not in several U.S. states that have recently drawn a hard line against unauthorized voice cloning. In 2025, the battle over voice rights has become one of the most fascinating intersections of technology, creativity, and law.
Why Voices Became the New Frontier
For decades, the “right of publicity” protected names and likenesses, but voices were often overlooked. Enter generative AI, capable of mimicking tone, cadence, and emotion so perfectly that even die-hard fans can’t tell the difference. This technological leap triggered alarm bells in the entertainment industry, where a voice isn’t just sound—it’s brand equity. From music legends to movie stars, performers demanded legal safeguards against digital impersonation.
Voice is deeply personal. It conveys identity, emotion, and trust. In advertising, a familiar voice can evoke nostalgia and credibility, making it a powerful marketing tool. But when technology enables perfect imitation without consent, the ethical and legal stakes skyrocket.
The Legal Landscape
States have responded with targeted legislation. Tennessee's Ensuring Likeness Voice and Image Security (ELVIS) Act, effective July 1, 2024, explicitly adds voice, including simulations, to its definition of protected identity. California amended its publicity statutes to cover "digital replicas," while New York's Senate Bill 7676B requires contracts for any AI-generated voice use and grants performers the right to sue over unauthorized cloning.
These laws generally prohibit creating, distributing, or profiting from a digital voice replica without written consent. Penalties vary but often include statutory damages and injunctive relief. Some states extend protection posthumously, meaning even deceased celebrities’ voices are off-limits without permission. The message is clear: if you plan to use an AI-generated celebrity voice in an ad, get a license—or get ready for a lawsuit.
Federal Action on the Horizon
While there is no comprehensive federal law yet, the Federal Trade Commission (FTC) has moved against impersonation fraud: in early 2024 it finalized a rule banning impersonation of governments and businesses and proposed extending that rule to individuals, including impersonation via AI-generated voices. The agency has stressed that there is no "AI exception" to consumer protection law. If the extension is adopted, regulators would gain stronger tools against deceptive voice-cloning practices.
Congress is also debating the NO FAKES Act, which would create a nationwide standard for protecting voices and likenesses from unauthorized digital replication. If passed, this law could harmonize state-level protections and simplify compliance for advertisers and tech companies.
Cultural and Commercial Stakes
Why does this matter? Because voice is power, and advertisers know it. Exploiting that power without consent undermines both artistic integrity and consumer trust. The entertainment industry, already grappling with AI's impact on music and film, views voice cloning as a tipping point for creative rights.
Celebrities have begun negotiating AI clauses in contracts, ensuring their voices cannot be replicated without explicit approval. Unions like SAG-AFTRA have also pushed for stronger protections, framing voice rights as part of broader labor negotiations in the age of AI.
Practical Implications for Brands
If you’re crafting an ad campaign, assume that any AI-generated celebrity voice requires explicit permission. Even “sound-alike” voices can trigger liability under state laws. The safest route? Use licensed voice talent or synthetic voices that don’t mimic real individuals. Transparency matters too—some states mandate disclosure when AI voices are used in commercial content.
Brands should also audit their creative workflows. Ensure that agencies and tech vendors comply with voice-specific laws and maintain documentation of consent. Failure to do so could result in lawsuits, reputational damage, and costly settlements.
Historical Context
The fight over voice rights isn't new. In 1988, singer Bette Midler sued Ford Motor Company over a sound-alike used in a commercial; the Ninth Circuit held that California common law protects a distinctive, widely known voice from deliberate imitation. That precedent now echoes in the AI era, where technology amplifies the risk of unauthorized imitation. What was once a niche legal issue has become a mainstream concern shaping advertising, entertainment, and tech ethics.
Looking Ahead
As AI technology races forward, expect more states to adopt voice-specific protections. Legal experts predict that voice rights will soon rival image rights in importance, shaping contracts, licensing deals, and creative workflows. For now, the rule is simple: if the voice isn’t yours, don’t use it without consent.
Sources

Artificial Intelligence Prompts Renewed Consideration of a Federal Right of Publicity, Library of Congress (Jan. 29, 2024)
https://www.congress.gov/crs-product/LSB11052

FTC Proposes New Protections to Combat AI Impersonation of Individuals, Federal Trade Commission (Feb. 15, 2024)
https://www.ftc.gov/news-events/news/press-releases/2024/02/ftc-proposes-new-protections-combat-ai-impersonation-individuals