A new report from Good Luck Have Fun reveals that multiple triple-A game developers are using an AI program for voice acting instead of human talent. The company behind the technology is Altered AI, whose software offers a library of vocal performances drawn from around 20 professional voice actors.
Although it shares similarities with text-to-speech, which reads sentences back as audio, AI voice acting is considered more ethically dicey. Because these tools can alter a voice actor’s tone and voice type, there are concerns that the technology could be used to supplant voice actors entirely.
NDAs mean that only two developers are explicitly named in the report. Hellblade developer Ninja Theory allegedly has a partnership with Altered. Additionally, Neon Giant reportedly used Altered for the voice acting in its 2021 game, The Ascent.
The technology is typically used for prototyping purposes, according to Altered CEO Ioannis Agiomyrgiannakis, who argued that his company was doing for voice acting what YouTube has done for video.
“When you have a dialogue, you have a level of imagination. But when you take the dialogue to the voice actors, it comes back and doesn’t sound as dynamic as you wanted it to,” explained Agiomyrgiannakis. “We provide an intermediate step where they can prototype the dialogue and have a checkpoint before they hit the studio.”
Video games have had a rocky relationship with voice acting, with the US actors’ union SAG-AFTRA going on strike in 2017 in the name of better pay and benefits for video game voice actors. Several titles were noticeably affected during that strike, including the Life is Strange prequel Before the Storm and Mortal Kombat 11. A deal was eventually struck, which ends in November of this year.
The argument has been made that voice AI is especially helpful for indie developers, and can be used in tandem with established voice actors. But both Horizon’s Ashly Burch and Marvel’s Spider-Man’s Yuri Lowenthal argued that there are easier ways around this.
“SAG-AFTRA has a low-budget agreement to address this issue,” acknowledged Burch. “It’s specifically designed so indie developers can get access to quality VO without breaking the bank. […] if you’re looking for something human and nuanced and alive, AI isn’t going to cut it.”
What does AI mean for voice actors and their image?
For Lowenthal, the worry is expressly about software like this being used to exploit actors’ talents in games that they otherwise wouldn’t be compensated for performing in.
“I know an actor who does a lot of performance capture and voice work and she has seen her very specific movement show up in games she never even worked on,” he said. “This is a scary precedent that has already been set, and I want to start a conversation with AI companies about how we could protect actors, and again, the ecosystem of storytelling.”
In a statement, SAG-AFTRA wrote that it would adapt its contracts to match technology’s evolution, and protect voice actors from their performances being used against them.
“These new technologies offer exciting new opportunities but can also pose potential threats to performers’ livelihoods. It is crucial that performers control exploitation of their digital self, be properly compensated for its use, and be able to provide informed consent,” wrote SAG-AFTRA.
“We know that change is coming. SAG-AFTRA is committed to keeping our members safe from unauthorized or improper use of their voice, image or performance, regardless of the technology employed. The best way for a performer to venture into this new world is armed with a union contract.”