NVIDIA have announced ACE, a way to bring NPCs to life in games

At their CES 2024 presentation, Nvidia revealed ACE, which is a simple name but anything but a simple tool. This new program takes voice input from the player (that's you), feeds it into an AI which generates a response, and then has a character say the words back.

OK, so that is simplifying things a bit, so here is how it works in full (there's a rough code sketch after the list, too).

  1. The player speaks to the character in the game

  2. That speech is captured and converted to text via a speech-to-text process, and the resulting text is sent to the LLM

  3. The LLM (Large Language Model) generates a reply, again as text

  4. The reply is converted from text to voice audio and sent back to the local machine, where an audio-to-lip-sync model is used

  5. Once processed, the audio and updated animations are sent to the game to be rendered in the scene

  6. The game presents the results to the player.
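
If you prefer to see it as code, here is a very rough sketch of that loop in Python. To be clear, every function name below is a made-up placeholder rather than Nvidia's actual API, so treat it as the list above rewritten as a runnable diagram, not something you could ship.

```python
# Illustrative only: each function stands in for one stage of the ACE pipeline,
# not Nvidia's real components.

def speech_to_text(mic_audio: bytes) -> str:
    # Step 2: the player's captured speech becomes a text prompt (local or cloud)
    return "Got any work going?"

def llm_generate_reply(prompt: str) -> str:
    # Step 3: the LLM writes the character's reply as text (local or cloud)
    return "Funny you should ask. Come back after dark and we'll talk."

def text_to_speech(reply: str) -> bytes:
    # Step 4: the reply text is turned into audio and sent back to the player's machine
    return reply.encode("utf-8")  # placeholder for real generated audio

def audio_to_lip_sync(audio: bytes) -> list[float]:
    # Step 4 continued: a lip-sync model turns that audio into facial animation data
    return [0.0] * len(audio)  # placeholder animation curve

def render_in_scene(audio: bytes, animation: list[float]) -> None:
    # Steps 5 and 6: the game plays the audio and animation on the character
    print(f"Rendering {len(animation)} animation frames with {len(audio)} bytes of audio")

def handle_player_speech(mic_audio: bytes) -> None:
    # Step 1: the player speaks, then each stage feeds the next
    prompt = speech_to_text(mic_audio)
    reply = llm_generate_reply(prompt)
    audio = text_to_speech(reply)
    animation = audio_to_lip_sync(audio)
    render_in_scene(audio, animation)

handle_player_speech(b"raw microphone capture")
```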

So all that is well and good, but what does it mean for you, I hear you ask? Well, part of the announcement is that Nvidia are releasing two of the modules to development partners such as Ubisoft and Tencent.

The first step is handled on your machine, as are the final few, but the second and third steps (speech-to-text and the LLM) can be run either locally or via the cloud. What that means is that a game can be set up to run those stages on its own hardware, or hand them off to a much larger model hosted remotely. If this all sounds like sci-fi, check out the video below to see it in action.
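
As a purely hypothetical illustration of that local-versus-cloud split, here is how a game might toggle between the two. The setting and helper names below are invented for the example and are not part of ACE.

```python
# Hypothetical sketch of choosing where the heavy lifting runs.
# None of these names come from Nvidia; they only illustrate the idea.

USE_CLOUD_INFERENCE = True  # e.g. exposed as an online option in the game's menu

def call_cloud_llm(prompt: str) -> str:
    # Stand-in for a request to a large model hosted on a remote service
    return "cloud reply to: " + prompt

def call_local_llm(prompt: str) -> str:
    # Stand-in for a smaller model running on the player's own GPU
    return "local reply to: " + prompt

def generate_reply(prompt: str) -> str:
    if USE_CLOUD_INFERENCE:
        return call_cloud_llm(prompt)
    return call_local_llm(prompt)

print(generate_reply("Where can I find the blacksmith?"))
```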