AI clones and the plagiarism problem for actors and voice artists

Summary:

  • Actors and voice artists face plagiarism-like issues as AI clones their voices or likenesses without consent, undermining creative control and earnings.
  • English law currently lacks clear protections, leaving performers reliant on patchy remedies such as passing off, copyright, or data privacy laws.
  • Growing industry concern has sparked calls for new UK legislation, clearer personality rights, and stronger ethical safeguards against AI misuse.

The idea of performers being digitally resurrected or cloned is no longer science fiction. Fans today can watch ABBA perform live in concert even though the legendary pop group’s members are in their seventies and not physically on stage. This is possible because ABBA appears as lifelike digital avatars of their younger selves, singing and dancing in a London residency show. Unlike traditional tribute acts, these “ABBA-tars” exist with the band’s full consent and participation – the group officially runs the show. The technology demonstrates exciting possibilities, but it also raises profound ethical and legal questions – especially when actors and voice artists find their likeness or voice copied without permission.

When AI impersonation crosses the line

Stars react to stolen voices

New artificial intelligence tools can closely mimic a person’s appearance or voice. In some cases, these digital doubles cross into uncomfortable territory. For example, Sir David Attenborough was shocked to discover an AI-generated voice clone of him delivering political messages he never made. He described feeling “profoundly disturbed” that others had effectively stolen and repurposed his distinctive voice.

Similarly, Hollywood actress Scarlett Johansson recently accused an AI company of “plagiarising” her voice. She revealed that a demo of an AI voice assistant sounded so much like her that even close friends were fooled – despite her refusal to lend her voice to the project. Johansson was outraged at the imitation and instructed legal counsel, prompting the company to withdraw the feature. These incidents highlight the growing trend of AI impersonation, where a performer’s unique voice or image is replicated without their involvement.

AI voices in gaming draw backlash

Voice actors have also raised alarms. In the video game industry, some developers have started using synthetic voices in place of human actors to voice minor characters or background lines. When a major studio admitted to employing AI-generated voices for its latest game, voice actors and fans immediately protested. One voice artist likened the practice to artistic theft, arguing that training a model on real performers’ recordings and then generating dialogue is tantamount to plagiarism of their creative work. Another actor warned that studios might be eager to “get rid of the people who make the stuff” by replacing them with digital stand-ins.

Deepfake imposters and scams

This is not just a theoretical fear. There have already been deepfake advertisements and scams falsely featuring famous actors. Tom Hanks, for instance, had to warn fans about a fake dental insurance advert circulating online that used a deepfaked video of him. The Oscar-winner stressed he had “nothing to do with” the ad, which was created without his consent. Likewise, the family of actor Bruce Willis recently debunked rumours that he sold the rights to his “digital twin”, after a deepfake of Willis appeared in a telecom advertisement. These examples show how easily technology can appropriate a person’s likeness for profit or deception – leaving the real performers cut out of the picture.

Why copying a performance feels like plagiarism

Plagiarism is the act of presenting someone else’s work, ideas, words, or creative output as one’s own, without appropriate acknowledgment or permission. It includes directly copying text, closely paraphrasing ideas, or using another individual’s original content – such as art, music, or research findings – without proper attribution.

Exploiting an artist’s identity

Performers spend years developing their craft, honing a signature voice or screen presence. When an AI copy mimics that unique style without permission, creatives often feel it is stealing their persona or artistic identity. Unlike an impression or parody performed by a human, a high-quality AI clone can be virtually indistinguishable from the real artist. The clone isn’t simply “inspired by” the original – it aims to duplicate the artist’s own expressions, inflections and facial movements. To many in the creative community, that crosses an ethical line.

Crucially, such cloning can rob artists of credit and compensation. If a studio can generate, say, a convincing Morgan Freeman narration or a spot-on impression of a leading actor without hiring them, it undercuts the livelihood of those performers. It allows others to profit from a voice or face that isn’t theirs. Some writers’ and actors’ groups have bluntly labelled generative AI tools as “plagiarism machines”, because these systems often learn by ingesting real creative work and then regurgitating it in altered form. In the case of actors, the AI often learns from hours of an individual’s past performances, effectively data-mining their artistry to create an unlicensed replica. This is why many argue it amounts not only to an infringement of rights but also to a form of plagiarism – passing off someone else’s creative expression as original content generated by a machine.

Not all clones are unwelcome

There is an important distinction to be made here: not all digital reproductions are nefarious or unwelcome. Some ageing or deceased actors have authorised use of their likeness via technology in respectful ways. For example, the family of a late voice actor gave permission for an AI system to recreate his voice so he could “continue” a role in a video game sequel as a tribute. And as noted with ABBA’s virtual concerts, artists themselves can decide to partner with tech companies to extend their performances digitally – ensuring they control the result and receive the earnings. In contrast, what worries performers is unlicensed cloning, where their voice or image is used like a cheap sample, without credit or control. That scenario feels akin to plagiarism because it appropriates the fruits of one person’s creativity and reputation to benefit another party.

The legal grey area in English law

If a living actor’s image or voice is reproduced without permission in the UK, what laws are actually being broken? At present, this question has no easy answer. English law does not recognise a specific “personality right” or image right that grants individuals blanket control over their face, name or voice. Instead, performers must rely on a patchwork of existing laws – none of which were designed with AI clones in mind.

Passing off and false endorsement

One possible remedy is the common law of passing off, a form of intellectual property protection against misrepresentation. British celebrities have successfully used passing off in cases of false endorsement. (For example, singer Rihanna once won a case against a fashion retailer that sold T-shirts with her image, by proving it falsely suggested she had endorsed the product.) However, passing off requires proving that the defendant misled the public into believing the celebrity authorised the use. This is a high bar – it usually applies to advertisements or merchandise implying a famous person’s sponsorship. In the context of AI-generated content, if an unrelated film studio or YouTube creator makes a deepfake film featuring an actor’s likeness, they might avoid any explicit claims that the actor approved it. That could make a passing off claim difficult. And even if a case succeeded, the damages in a false endorsement claim might be relatively limited.

Copyright and performers’ rights

Copyright law is another piece of the puzzle, albeit a limited one here. Performers typically don’t own copyright in their appearance or voice itself. Copyright protects specific recordings, images and films (usually owned by producers or photographers), not the underlying person being recorded. If an AI creation uses actual clips or audio from an actor’s past work, that could infringe the copyright of those clips. In theory, a movie studio or record label could object to their footage being repurposed in a deepfake. But when an AI generates entirely new content that merely imitates an actor, it might not technically be copying any existing protected material – it’s creating new footage in the actor’s style. This falls outside the straightforward scope of current copyright law.

Performers do have certain performers’ rights, which give them limited control over recordings of their live performances. The law forbids recording or broadcasting a theatre actor’s live performance without consent, and prohibits selling copies of such a recording. Nonetheless, these provisions assume a direct recording of the actual performance. They do not clearly prevent someone from digitally recreating a performance from scratch. If an AI “re-performs” an actor’s role with new dialogue or scenes, the current performers’ rights may not apply, because no actual recording of the real actor exists in that process.

Privacy, data and other remedies

Another legal avenue is privacy and data protection law. Under UK data protection rules (the UK GDPR), personal data includes any information that can identify an individual – voices and images can fall under that definition. If a company, without consent, scrapes an actor’s online videos or voice recordings to train an AI model, it is arguably processing that person’s personal data unlawfully. Individuals have rights to object and to demand deletion of personal data used without permission. Indeed, voice or facial data could even be considered sensitive biometric data in some cases, attracting stricter consent requirements. However, using data protection law to combat creative misappropriation is untested. It might work to force removal of an AI model or deepfake content in some instances, yet it doesn’t squarely address the core issue of protecting a performer’s artistry or likeness as such.

Other legal doctrines offer only partial help. A person who finds their deepfaked likeness placed in a defamatory or highly offensive context could potentially sue for defamation or for the misuse of private information. These routes depend on showing damage to reputation or a breach of privacy, and they do not apply simply because a performance was copied. Overall, UK law’s toolkit for a celebrity dealing with an AI clone remains patchy and often inadequate.

Calls for stronger protection

Unions push back against AI

Recognising these gaps, performers’ unions and many legal experts in the UK are pushing for reform. Equity, the British actors’ union, has warned that artificial intelligence is developing so fast that it could “exploit performers’ voices and likenesses” before the law catches up. The union notes that some members have already signed contracts allowing AI manipulation of their performances, sometimes without fully understanding the implications. Meanwhile, across the Atlantic, Hollywood’s major unions have made AI a central issue in negotiations. The recent actors’ strike in the US was fuelled in part by fears that studios might scan background actors’ faces or use voice clones, then reuse them indefinitely without pay – effectively cutting human artists out of future productions.

Government weighs new laws

In response to mounting pressure, the UK government has begun to consider new legal safeguards. It launched a consultation in late 2023 to ask whether the current framework offers enough control over one’s “personality” – meaning one’s image and likeness – in the age of AI. Early indications suggest the government is open to exploring a dedicated personality right: a law that would explicitly let individuals control commercial use of their name, image or voice. This would be a significant shift, bringing UK law closer to the publicity rights found in some other countries. Government ministers acknowledge that deepfake technology and digital replicas raise novel challenges that may require targeted legislation. However, they also caution that creating a new IP right for one’s likeness is complex, as it touches on free expression, technological innovation and existing intellectual property doctrines. For now, they are taking a cautious approach.

Interim measures in the industry

Industry self-regulation and contracts are attempting to fill part of the void in the meantime. Some studios have voluntarily pledged not to use an actor’s likeness without a deal in place. In advertising, the UK’s advertising codes prohibit using someone’s image or voice in ads without permission, offering at least a route to get offending ads withdrawn. And at the creative level, there is a growing consensus that certain ethical lines must be drawn. Many performers are now negotiating clauses in their contracts to restrict how their voice or appearance can be digitally altered or reproduced. High-profile actors have started to insist on approval rights if any de-ageing or AI doubling is planned in a production. These measures, however, rely on individual clout. Lesser-known actors may not have the bargaining power to demand such protections. Thus, without clear legal rights, it remains difficult for most performers to prevent or remedy unauthorised AI cloning of their work.

The road ahead

Blurring lines, new possibilities and risks

As AI technology becomes even more sophisticated, the line between a genuine performance and a synthetic one could blur further. This brings both opportunities and risks. On one hand, digitally revived performances might enchant audiences – imagine beloved actors of the past appearing in new stories, or present-day stars being able to “perform” in multiple projects at once via their digital twin. On the other hand, if left unchecked, this capability could enable a form of identity theft in entertainment. The concept of plagiarism, typically applied to copying text or music, may need rethinking when it comes to copying humans. Ultimately, society may decide that a person’s voice and likeness are an extension of their creative work, deserving protection just like a script or a song.

The need for balance and safeguards

The UK is at a crossroads on this issue. In the coming years, lawmakers will likely face tough questions about how to balance innovation with individual rights. If new laws grant actors and voice artists more explicit control, it could set boundaries on what AI companies and film studios can do with digital replicas. Conversely, not taking action might leave artists increasingly vulnerable to what feels like digital plagiarism of their talent. In the interim, raising awareness is key. The more the public recognises an AI-generated celebrity voice or face as a replication rather than the real thing, the less likely such content can deceive or cause harm.

Wrapping up…

The rise of AI in media forces us to confront what ownership we have over our identity and creativity. For actors and voice artists, whose livelihood is built on their personal expression, the stakes are especially high. Plagiarism in this arena is not about copying words on a page – it is about copying a persona, a voice, the soul of a performance. Ensuring fair and ethical use of these new tools will require updates in both our laws and our norms, so that technology enhances human artistry rather than hollowing it out.
