AI Is Making Voice Actors Nervous

Last week, PJ’s Matt Margolis wrote about a deep fake video of Joe Biden. If you didn’t see it, the Twitter account that posted it has been suspended. But it’s the internet we’re talking about here, so the video is floating around somewhere in cyberspace if you want to hunt it down. It featured imperfect but startlingly realistic “footage” of Biden lighting up transgender men and telling them that, no matter how hard they try, they will never be women and are destined for depression and doom. Biden, of course, would never say that, but the creator got darn close to making it look like he did.

In the very near future, anyone who has a person’s image and a voice sample will pretty much be able to make that person say whatever he wants. Motherboard recently published an article about 4chan users who may have used ElevenLabs technology to create deep fakes of celebrity voices. The results included fakes of Ben Shapiro making racist comments about Alexandria Ocasio-Cortez and of Emma Watson reading excerpts from Mein Kampf.

Voice and image hijacking is going to open up a whole new world for lawyers and the courts. That will be particularly true with the disturbing rise of AI porn. But what about cases in which someone is contractually obligated to give up the use of his or her voice to a studio or production company? Voice actors are the latest group to worry about the effects of AI creep. Voice actors tell Vice’s Motherboard that they are being asked to sign contracts that would allow companies to use their voices to generate vocal tracks via AI. Not only could this put voice actors out of work, but they may also forfeit the rights to any royalties from future products that use their voices. Voice actor SungWon Cho told Motherboard:

It’s disrespectful to the craft to suggest that generating a performance is equivalent to a real human being’s performance. Sure, you can get it to sound tonally like a voice, and maybe even make it sound like it’s capturing an emotion, but at the end of the day, it is still going to ring hollow and false. Going down this road runs the risk of people thinking that voice-over can be replaced entirely by AI, which really makes my stomach turn.

Beyond the issues of respect and reality, other artists expressed concerns about AI using their voices without their consent or even their knowledge. Another raised the point that a performer’s consent should be ongoing, noting that a performer can talk with a director or producer about any concerns with a script. With AI, it is full speed ahead, no matter what the original actor’s objections may be. Conceivably (and I know it may be a stretch), an actor who is opposed to abortion or transgenderism could find his voice being used to promote views contrary to his own. Or, for that matter, vice versa.

Ideally, everyone should read the fine print before signing a contract. But Tim Friedlander, president and founder of the National Association of Voice Actors, told Motherboard that clauses allowing producers to use AI to manipulate an actor’s voice are becoming commonplace. In fact, in some contracts, such a concession may be mandatory. Friedlander said:

The language can be confusing and ambiguous. Many voice actors may have signed a contract without realizing language like this had been added. We are also finding clauses in contracts for non-synthetic voice jobs that give away the rights to use an actor’s voice for synthetic voice training or creation without any additional compensation or approval. Some actors are being told they cannot be hired without agreeing to these clauses.

Big deal, you say. Actors? Who needs ’em? Well, beyond the fact that “then AI came for the voice actors” (and it will eventually come for you), there is the issue of reality. As Ben Shapiro recently pointed out, the problem with AI is not necessarily the technology but who controls it. In the not-too-distant future, whoever controls AI will control not just the messaging but reality itself. Your reality will be whatever they say it is, and you won’t be able to prove otherwise.
