SANTA CLARA — NVIDIA has launched production microservices for the NVIDIA Avatar Cloud Engine (ACE), a significant step toward lifelike digital characters in games and applications. These microservices let developers integrate cutting-edge generative AI models into their digital avatars, transforming how players interact with them.
The ACE microservices give developers access to state-of-the-art AI models, including NVIDIA Omniverse Audio2Face™ (A2F), which generates expressive facial animation for avatars from an audio source, and NVIDIA Riva automatic speech recognition (ASR), which supports customizable multilingual speech and translation applications built with generative AI.
Among the developers embracing ACE are industry heavyweights such as Charisma.AI, Convai, Inworld, miHoYo, NetEase Games, Ourpalm, Tencent, Ubisoft, and UneeQ.
Keita Iida, NVIDIA’s Vice President of Developer Relations, emphasized the transformative potential of generative AI in game creation and gameplay. “NVIDIA ACE opens up new possibilities for game developers by populating their worlds with lifelike digital characters while removing the need for pre-scripted dialogue, delivering greater in-game immersion,” said Iida.
Leading game and interactive avatar developers have already begun exploring the potential of ACE and generative AI technologies to redefine player-NPC interactions in games and applications.
Zhipeng Hu, Senior Vice President of NetEase, highlighted NVIDIA’s role in advancing gaming technologies. “NVIDIA is making games more intelligent and playable through the adoption of gaming AI technologies, which ultimately creates a more immersive experience,” said Hu.
Tencent Games echoed this sentiment, calling NVIDIA ACE a “milestone moment for AI in games” and expressing enthusiasm for the future of digital avatars in video games.
Traditionally, non-playable characters (NPCs) in games have been limited by predetermined responses and animations, resulting in transactional and short-lived interactions. However, ACE aims to change this paradigm by introducing lifelike NPCs with individual personalities and interactions.
Convai, a key player in this development, leverages Riva ASR and A2F to create lifelike NPCs with low-latency response times and high-fidelity natural animation. The latest iteration of Convai’s framework, showcased in collaboration with NVIDIA through the Kairos demo, demonstrates the enhanced interactivity and awareness of NPCs, allowing them to engage in meaningful conversations, manipulate objects, and guide players through game worlds.
The availability of Audio2Face and Riva ASR microservices marks a significant milestone in the evolution of interactive avatar development, empowering developers to incorporate these advanced AI models into their projects.
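The article does not cover integration details, but in broad strokes a developer consuming such a microservice would package audio into a request and send it to a deployed endpoint. The sketch below illustrates that pattern only; the endpoint path, field names, and parameters are hypothetical assumptions for illustration, not NVIDIA's actual Riva ASR or ACE API.

```python
import base64
import json

# Hypothetical endpoint for a locally deployed ASR microservice.
# This URL and the request schema below are illustrative assumptions,
# not the real NVIDIA Riva ASR interface.
ASR_ENDPOINT = "http://localhost:8000/v1/asr"

def build_asr_request(audio_bytes: bytes, language: str = "en-US") -> str:
    """Package raw audio as a JSON request body (illustrative only)."""
    return json.dumps({
        "audio": base64.b64encode(audio_bytes).decode("ascii"),
        "language_code": language,
        "sample_rate_hz": 16000,
    })

# Example: a tiny placeholder buffer standing in for real PCM audio.
payload = build_asr_request(b"\x00\x01\x02\x03")
print(json.loads(payload)["language_code"])
```

In a real integration, the returned transcript would then be fed to a dialogue model, whose spoken response could drive A2F facial animation — but the concrete wire formats come from NVIDIA's own documentation, not this sketch.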
NVIDIA’s ACE microservices promise to reshape the landscape of digital avatars in games and applications, ushering in a new era of immersive and dynamic player experiences.