Generative AI technologies are revolutionizing the way games are made and played. Game developers are exploring how these technologies can accelerate their content streams and deliver new gaming experiences previously thought impossible. One area of focus, digital avatars, will have a transformative impact on how players interact with non-playable characters (NPCs).
Historically, NPCs have had predetermined responses and facial animations, and players can communicate with them only through a limited set of options. These interactions tend to be transactional, brief, and often ignored.
NVIDIA, a leading company in graphics and AI computing, today unveiled production microservices for NVIDIA Avatar Cloud Engine (ACE) technology that enable game, tool, and middleware developers to integrate state-of-the-art generative AI models into the digital avatars featured in their games and applications. ACE models use a flexible combination of local and cloud resources to transform player input into a dynamic character response.
Models include:
NVIDIA Riva Automatic Speech Recognition (Riva ASR) to transcribe human speech.
NVIDIA Riva Text-to-Speech (Riva TTS) to generate audible speech.
NVIDIA Audio2Face (A2F) to generate facial expressions and lip movements.
NVIDIA NeMo Large Language Model (NeMo LLM) to understand the player's text and transcribed voice and generate a response.
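The four models above form a pipeline: player speech is transcribed, a response is generated, that response is synthesized into audio, and the audio drives facial animation. The following is a minimal illustrative sketch of that flow; every function here is a hypothetical stand-in, not the real Riva, NeMo, or Audio2Face API, which developers would access through NVIDIA's microservices.

```python
# Illustrative sketch of the ACE avatar pipeline: ASR -> LLM -> TTS -> A2F.
# All functions are hypothetical placeholders, not real NVIDIA APIs.

from dataclasses import dataclass, field


@dataclass
class AvatarResponse:
    text: str                          # reply produced by the language model
    audio: bytes                       # synthesized speech waveform
    blendshapes: list = field(default_factory=list)  # facial-animation weights


def transcribe(audio_in: bytes) -> str:
    """Stand-in for Riva ASR: player speech -> text."""
    return "Where can I find the blacksmith?"


def generate_reply(player_text: str) -> str:
    """Stand-in for NeMo LLM: transcribed text -> in-character response."""
    return f"You asked: '{player_text}' Head east past the market."


def synthesize(text: str) -> bytes:
    """Stand-in for Riva TTS: response text -> audible speech."""
    return text.encode("utf-8")  # placeholder for a real waveform


def animate(audio: bytes) -> list:
    """Stand-in for Audio2Face: speech audio -> lip/expression weights."""
    return [0.0] * 52  # e.g. one weight per facial blendshape


def respond(player_audio: bytes) -> AvatarResponse:
    # The four ACE models run in sequence to turn player input
    # into a voiced, animated character response.
    text_in = transcribe(player_audio)
    reply = generate_reply(text_in)
    audio_out = synthesize(reply)
    shapes = animate(audio_out)
    return AvatarResponse(reply, audio_out, shapes)
```

In a real integration, each stage would be a call to the corresponding microservice (local or cloud-hosted), with the renderer consuming the blendshape stream per frame.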
NVIDIA announced that the A2F and Riva ASR microservices are now available to middleware, tool, and game developers looking to improve the NPCs in their games. In collaboration with Convai, NVIDIA showcased the latest version of its Kairos demo to show how next-generation AI NPCs could transform gaming. Convai is an NPC development platform that makes it easy to give characters in 3D worlds human-like conversation, perception, and action capabilities.
“Generative AI-empowered characters in virtual worlds unlock various use cases and previously impossible experiences. Convai is leveraging Riva ASR and A2F to enable lifelike NPCs with low-latency response times and high-fidelity natural animations,” said Purnendu Mukherjee, founder and CEO of Convai.
Open-ended conversations with NPCs unlock new possibilities for interactivity in games. But conversation alone is not enough: it should have consequences that can lead to actions. For NPCs to perform actions, they must be aware of the world around them and be able to interact with it dynamically.
With our partner Convai's latest releases, we take our collaborative demo to the next level, enabling these AI NPCs with the following new features:
Spatial awareness: Enables game characters to interact and describe the world during conversations.
Actions: Enables game characters to interact with objects in the game world based on conversation, for example, delivering a bottle of sake when asked.
NPC-to-NPC interaction: Enables game characters to have generated conversations without player interaction.
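The "Actions" feature above implies a bridge between conversation and gameplay: the language model's output must be mapped to something the game engine can execute, such as the sake-delivery example. A minimal sketch of that idea, assuming the LLM emits a structured intent (the intent names and handlers here are invented for illustration, not part of ACE or Convai's API):

```python
# Hypothetical intent-to-action dispatch for a conversational NPC.
# In practice the (intent, argument) pair would come from the LLM's
# structured output rather than being passed in directly.

def deliver_item(item: str) -> str:
    """Hand an object to the player, e.g. 'a bottle of sake'."""
    return f"NPC delivers {item} to the player"


def describe_surroundings(_: str) -> str:
    """Spatial awareness: describe nearby objects during conversation."""
    return "NPC describes the shop around it"


# Table mapping conversational intents to game-world actions.
ACTIONS = {
    "deliver": deliver_item,
    "describe": describe_surroundings,
}


def dispatch(intent: str, argument: str) -> str:
    handler = ACTIONS.get(intent)
    if handler is None:
        # No matching game action: the NPC simply keeps talking.
        return "NPC keeps talking"
    return handler(argument)
```

A table-driven dispatch like this keeps the set of permitted actions explicit, so the model can only trigger behaviors the developer has wired into the game.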
Convai has integrated the new NVIDIA ACE microservices, Audio2Face and Riva ASR. Game characters now get better lip sync, better expressions, and accurate speech detection when listening to the player. NVIDIA is working with leading developers across the gaming ecosystem to create digital avatars using ACE technologies, including Charisma.AI, Inworld, miHoYo, NetEase Games, OurPalm, Tencent, Ubisoft, and UneeQ.
“This is a historic moment for AI in games,” Tencent Games said. “NVIDIA ACE and Tencent Games will help lay the foundation that will bring digital avatars with individual, lifelike personalities and interactions to video games.”
“For years NVIDIA has been the Pied Piper of gaming technologies, offering new and innovative ways to create games. NVIDIA is making games smarter and more playable through the adoption of AI gaming technologies, which ultimately creates a more immersive experience,” said Zhipeng Hu, senior vice president of NetEase and head of the LeiHuo business group.