Nvidia’s “Audio2Face” technology uses AI to generate lip-synced facial animations from audio files



AI innovation: Game development is an incredibly time-consuming and expensive process, with art and animation budgets often consuming a large chunk of a team’s cash reserves. Believable facial animation, in particular, is vital for titles with a lot of cutscenes. That’s why Nvidia is working on an AI-based tool that can read audio files in real time and generate suitable facial animations; no mocap required.

This technology, called “Audio2Face,” has been in beta for several months. Despite its potentially revolutionary impact for game developers (and animators in general), it didn’t attract much attention until recently.

As you’d probably expect from any technology that is both AI-powered and still in beta, Audio2Face isn’t perfect. The quality of the source audio has a big impact on the quality of the resulting lip sync, and the tool doesn’t seem very good at capturing facial emotion. No matter what phrase you throw at Audio2Face’s standard “Digital Mark” character, the eyes, cheeks, ears, and nose remain pretty static. There’s some movement, but it’s generally more subdued than the lip animations, which are clearly the focus.

But maybe that’s a good thing. Animators train for years to convey accurate emotion through a 3D character. Depending on how easily the tool fits into a particular developer’s workflow, it could provide useful placeholder lip-sync animations while animators focus on other parts of a character’s face.

Some of our readers, or just plain hardcore fans of Cyberpunk 2077, may remember that CDPR’s title uses a similar technology called “JALI.” It used AI to automatically lip-sync dialogue in all of the game’s fully voiced languages, taking that burden off the animators themselves.

Audio2Face doesn’t have that capability as far as we can tell, but it still looks useful. If we hear of cases where a developer has used the technology, we’ll let you know. If you want to try it out for yourself, the open beta is available for download, though note that it requires an RTX GPU to work properly.
