Nvidia announced the availability of its generative artificial intelligence microservices, Nvidia ACE, intended to accelerate the next wave of digital humans. These technologies are already transforming industries such as customer service, gaming, and healthcare by making it easier to create, animate, and manage lifelike digital avatars for more natural and engaging interactions. Digital humans are highly realistic virtual representations of people, created through advanced artificial intelligence and graphics technologies. These avatars are designed to interact with users much like real humans, using speech recognition, facial animation, and even body gestures to make conversations more immersive and natural. Thanks to Nvidia ACE, these interactions become even more sophisticated, bringing us closer to a future where interacting with computers is as natural as talking to a person.
Nvidia ACE includes several advanced technologies that work together to create these digital humans. Nvidia Riva provides automatic speech recognition, text-to-speech, and neural machine translation, allowing avatars to understand voice requests and respond accurately and in context. Nvidia Nemotron, an advanced language model, handles language understanding and generates appropriate responses, further improving the interaction. Another key element is Nvidia Audio2Face, which produces realistic facial animation driven by an audio track, so avatars can express emotions and react visually during a conversation, making the interaction more vivid and authentic. Additionally, Nvidia Omniverse RTX enables real-time simulation of realistic skin and hair, further enhancing the avatars' appearance.
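To make the microservice idea concrete, here is a minimal sketch of how an application might talk to a language-model microservice of this kind. NIM language microservices generally expose an OpenAI-compatible chat-completions API; the endpoint URL and model identifier below are placeholder assumptions, not values confirmed by this article.

```python
import json

# Assumed local deployment details -- adjust to your own NIM setup.
NIM_URL = "http://localhost:8000/v1/chat/completions"  # hypothetical endpoint
MODEL = "nvidia/nemotron-example"  # placeholder model id, not a confirmed name

def build_chat_request(user_text: str) -> dict:
    """Build an OpenAI-compatible chat-completion payload, the request
    shape typically accepted by NIM language microservices."""
    return {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": "You are a digital-human assistant."},
            {"role": "user", "content": user_text},
        ],
        "max_tokens": 128,
        "temperature": 0.7,
    }

payload = build_chat_request("Hello! Can you introduce yourself?")
print(json.dumps(payload, indent=2))
# A real deployment would POST this payload to NIM_URL, e.g. with
# requests.post(NIM_URL, json=payload, timeout=30), and feed the reply
# text onward to a speech and animation stage such as Riva TTS and Audio2Face.
```

The point of the microservice design is exactly this separation: the avatar application only assembles requests and routes responses, while speech, language, and animation each run behind their own service boundary.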
Nvidia also introduced new technologies such as Audio2Gesture, which generates body gestures from an audio track, and Nemotron-3 4.5B, a small language model designed for low-latency AI inference on RTX devices. Jensen Huang, founder and CEO of Nvidia, said: “Digital humans will revolutionize industries. Advances in multimodal language models and neural graphics, delivered by Nvidia ACE to our developer ecosystem, bring us closer to a future of intention-based computing, where interacting with computers will be as natural as interacting with humans.” Previously, Nvidia provided ACE as NIM microservices for data centers; it is now expanding this offering with ACE PC NIM microservices for deployment across an installed base of 100 million RTX AI PCs and laptops. The expansion includes the Nemotron-3 4.5B model, designed to run on device with accuracy comparable to that of large language models running in the cloud.