Postmortem Publicity Rights and Digital Use of Celebrity Likeness
Exploring the legal, contractual, and federal challenges of AI-driven celebrity likeness—from state statutes to the proposed NO FAKES Act.
Due to advancements in artificial intelligence, Hollywood studios can now digitally recreate deceased actors. Key examples include the recreation of Peter Cushing in Rogue One and the decision to cast James Dean in a planned Vietnam War-era film titled Finding Jack. These AI resurrections of dead actors, however, are raising difficult legal questions under intellectual property law.
The Core Issue: Who Owns a Digital Ghost?
The right of publicity, which protects an individual's commercial interest in their name, image, voice, and likeness, emerged from Warren and Brandeis's seminal privacy framework but has evolved into a distinctly property-based doctrine. California Civil Code Section 3344.1 and the common law recognize this right as an inheritable asset, with California extending protection 70 years post-mortem and Indiana providing a full century of coverage. Other states, such as New York, only recently enacted legislation addressing the digital resurrection of performers. In many jurisdictions, the protection simply doesn't exist or has not yet been tested in court.
These legal concerns have become a microcosm of the post-strike landscape. As studios stockpile data on actors' faces, voices, and mannerisms, unauthorized digital performances, particularly of deceased talent, become increasingly likely.
Contractual Anticipation of New Technology
California contract law generally requires explicit authorization for uses that extend beyond the scope contemplated in the original bargain. In Eastwood v. Superior Court (1983), the California Court of Appeal emphasized that publicity rights waivers must be "clear and unambiguous" regarding the specific uses authorized. AI performances arguably represent such a fundamental expansion of "use" that generic likeness language cannot suffice.
This creates a hierarchy of contractual strength: agreements with AI-specific language offer the clearest protection, broad technology-neutral clauses sit in the middle, and legacy agreements with only traditional likeness provisions are the weakest. As a result, studios face increasing litigation risk, particularly when dealing with high-value estates willing to fight aggressive interpretations.
For example, the estate of the late Good Will Hunting actor Robin Williams turned down requests to recreate his voice with AI for a documentary, pointing to instructions Williams had reportedly left prohibiting any posthumous digital manipulation of his likeness. Other actors, however, may have signed contracts with no conception that their performances might someday be digitally reconstructed.
Even when contracts do include likeness rights, they do not necessarily specify whether those rights extend to AI-driven performances that create entirely new dialogue or scenes the actor never recorded or endorsed. This ambiguity has become a breeding ground for legal disputes.
Jurisdictional Ramifications
Until recently, most conflicts surrounding AI and posthumous performances have either been resolved through confidential settlements or avoided altogether through risk-averse business practices. And while California maintains some of the nation's most robust post-mortem publicity protections, how those statutory frameworks apply to AI-generated content remains uncertain.
As applied to digital contexts, the celebrity likeness rights doctrine still awaits definitive judicial interpretation, and that interpretation will shape industry practices going forward.
The “NO FAKES Act”
At the federal level, lawmakers are starting to take notice. In late 2024, a bipartisan group in Congress introduced the NO FAKES Act (the Nurture Originals, Foster Art, and Keep Entertainment Safe Act). The bill would prohibit the unauthorized creation and distribution of digital replicas of individuals, with specific language protecting both living and deceased persons.
If enacted, the legislation would bring long-overdue doctrinal coherence to an area currently governed by state-level statutes, contractual provisions, and interpretive ambiguity. It would also give heirs and estates clear standing to sue when a digital replica is created or distributed without authorization, even absent a traditional filmed performance.
Studios Are Restructuring Their Contracts
In response to these legal shifts, some studios are beginning to update their talent agreements. Newer contracts often include AI-specific provisions requiring actors to affirmatively grant or deny permission for digital replicas, and sometimes setting rates or royalties for future use. These clauses are now becoming a standard part of negotiations, particularly with high-profile talent.
Long-Term Impact
There is a clear financial incentive to employ artificial intelligence in the exploitation of celebrity likenesses. AI-generated performances are already being used in commercials, video games, and branded content. The question isn't whether the practice will continue, but on what terms, and who will be compensated.
Next Steps
The NO FAKES Act remains under congressional consideration despite bipartisan support in both chambers. The legislation has gained traction in the 119th Congress, and its passage would mark a significant shift toward federal standardization in an area now governed by a patchwork of state laws and contractual interpretations. Federal legislation could provide the uniform framework that studios, performers, and estates have been seeking.