In 1992, a jury awarded the singer Tom Waits the equivalent of $6 million in today’s dollars because Frito-Lay used a voice-alike in its Doritos ad and misappropriated his right of publicity. Bette Midler, Shirley Booth, and Bert Lahr also sued advertisers who used their voice-alikes in ads.
The reason for the lawsuits is that a distinctive voice is a recognizable component of a person’s identity, and the use of a person’s identity without consent to sell goods violates their right of publicity. With new technology, there will be many more such incidents.
Once a privacy right, the right of publicity has morphed into a property right, transferable by license to an advertiser for commercial use. Associating a well-known celebrity with a commercial product can have exceptional value to an advertiser, thus their willingness to pay a king’s ransom for the right.
Akin to misappropriation of the right of publicity is the Lanham Act’s prohibition against using a person’s identity to create a false endorsement, either through action or words. False endorsement and right of publicity claims can exist together.
Most states have statutes creating a right of publicity. Maryland does not. Maryland’s relevant common law is old and at odds with the modern view that the right is a property right, as it is in most states.
Because a distinctive voice, like Waits’ or Midler’s, is an element of a person’s identity, its use without permission to sell products violates the right of publicity. And, depending on the words spoken by the purloined voice, its use may create a false endorsement in violation of Lanham Act Section 43(a).
In a world where celebrities charge millions of dollars to lend their names and looks to help an advertiser sell goods, the damages caused by misappropriation, often measured by comparables, create an existential threat to a small or medium-size business’s survival.
New technology, specifically artificial intelligence, makes it a snap to recreate someone’s voice and use it in an ad. That’s one reason the Federal Trade Commission recently published the “Voice Cloning Challenge,” announcing it will pay for technological solutions to protect voice artists from misappropriation “in ways that threaten their livelihoods and deceive the public.”
The FTC has good reason for concern. OpenAI just announced it will launch a program able to clone any human voice from a 15-second sound clip. Dollars to donuts, some advertisers will use this tool to clone a celebrity’s voice and use it in an ad to hawk their products.
Worse, some advertisers will use the cloned voice to read from prepared scripts that have the celebrity endorsing the advertiser’s product, violating both the right of publicity and Section 43(a).
Some violations will be knowing, and others will occur because users of these programs do not know the rules of the road, exposing their employers to vast damages. These programs will likely come with warnings, but it is one thing to post a warning and another thing entirely for users to read it, understand the dos and don’ts of usage, or obey the law.
Employers must be vigilant. They must adopt and circulate policies on use of these programs. They need to sit their employees down from time to time, whether in person or virtually, for instruction on what is and is not legal. They need to monitor their company’s marketing output. The stakes are enormous if they fail to do this.
James B. Astrachan, a partner at Goodell, DeVries, Leech & Dann, LLP, taught Trademark and Unfair Competition Law at University of Baltimore Law School for 23 years.
This article was originally published in The Daily Record.