NVIDIA Launches ACE Generative AI Microservices to Revolutionize Digital Humans



NVIDIA has officially announced the general availability of its ACE generative AI microservices, a suite designed to accelerate the development of lifelike digital humans. According to the NVIDIA Newsroom, this significant technological advance aims to revolutionize industries such as gaming, healthcare, and customer service.

Expanding Capabilities with ACE

The ACE (Avatar Cloud Engine) microservices suite includes various technologies that enable the creation and animation of realistic digital humans. These technologies are now generally available for cloud deployment and in early access for RTX AI PCs. Companies such as Dell Technologies, ServiceNow, and Perfect World Games have already started integrating ACE into their operations.

Technologies included in the ACE suite are:



  • NVIDIA Riva for automatic speech recognition (ASR), text-to-speech (TTS), and neural machine translation (NMT).
  • NVIDIA Nemotron for language understanding and contextual response generation.
  • NVIDIA Audio2Face for realistic facial animation based on audio tracks.
  • NVIDIA Omniverse RTX for real-time, path-traced rendering of realistic skin and hair.
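These components are designed to chain into a conversational loop. As a rough sketch of the data flow only (every function here is a hypothetical stand-in, not a real NVIDIA API), a single digital-human turn might look like this:

```python
# Hypothetical sketch of the ACE digital-human pipeline. All function
# names and return values are illustrative stand-ins, not NVIDIA APIs.

def speech_to_text(audio: bytes) -> str:
    """Stand-in for NVIDIA Riva ASR: player audio in, transcript out."""
    return "What quests are available?"  # placeholder transcript

def generate_response(transcript: str) -> str:
    """Stand-in for NVIDIA Nemotron: contextual response generation."""
    return "The village elder has a task for you."  # placeholder reply

def text_to_speech(text: str) -> bytes:
    """Stand-in for NVIDIA Riva TTS: reply text in, speech audio out."""
    return text.encode("utf-8")  # placeholder audio buffer

def audio_to_face(audio: bytes) -> list[float]:
    """Stand-in for NVIDIA Audio2Face: audio in, blendshape weights out."""
    return [0.0] * 52  # placeholder facial blendshape weights

def digital_human_turn(player_audio: bytes) -> tuple[bytes, list[float]]:
    """One conversational turn: ASR -> LLM -> TTS -> facial animation."""
    transcript = speech_to_text(player_audio)
    reply = generate_response(transcript)
    reply_audio = text_to_speech(reply)
    blendshapes = audio_to_face(reply_audio)
    # The blendshape weights would drive a character rendered in
    # real time by Omniverse RTX.
    return reply_audio, blendshapes

audio, faces = digital_human_turn(b"...player speech...")
```

In a real deployment each step would be a call to the corresponding microservice rather than a local stub; the sketch only illustrates how the four technologies hand data to one another.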

Future Technologies on the Horizon

NVIDIA also announced upcoming technologies such as NVIDIA Audio2Gesture, which generates body gestures based on audio tracks, and NVIDIA Nemotron-3 4.5B, a new small language model designed for low-latency, on-device inference on RTX AI PCs.

“Digital humans will revolutionize industries,” said Jensen Huang, founder and CEO of NVIDIA. “Breakthroughs in multi-modal large language models and neural graphics, delivered by NVIDIA ACE to our ecosystem of developers, are bringing us closer to a future of intent-driven computing, where interacting with computers is as natural as interacting with humans.”

ACE in Action

To date, NVIDIA has provided ACE as NIM (NVIDIA Inference Microservices) for developers to operate in data centers. The company is now expanding this offering to over 100 million RTX AI PCs and laptops. The new AI Inference Manager SDK simplifies the deployment of ACE on PCs by preconfiguring them with the necessary AI models and dependencies.
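For language-model NIM microservices, requests follow the familiar OpenAI-style chat-completions format. A minimal sketch of assembling such a request (the endpoint URL and model identifier below are illustrative assumptions, not documented values):

```python
# Hypothetical sketch of preparing a request for an LLM NIM
# microservice. The URL and model id are illustrative assumptions.
import json

NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local deployment

def build_nim_request(user_message: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload for a NIM endpoint."""
    return {
        "model": "nvidia/nemotron-mini",  # illustrative model id
        "messages": [
            {"role": "system", "content": "You are a digital-human NPC."},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 128,
    }

payload = build_nim_request("Where can I find the hidden entrance?")
print(json.dumps(payload, indent=2))
# An HTTP client (e.g. requests) would POST this payload to NIM_URL;
# the JSON response carries the NPC's generated reply.
```

The same request shape works whether the microservice runs in a data center or, with the AI Inference Manager SDK, locally on an RTX AI PC.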

At COMPUTEX, NVIDIA showcased an updated version of the Covert Protocol tech demo, developed in collaboration with Inworld AI. The demo allows players to interact with digital-human NPCs using conversational language to complete missions.

Industry Adoption and Future Prospects

Companies like Aww Inc., Inventec, and Perfect World Games are pioneering the adoption of ACE technologies. Aww Inc. plans to use ACE Audio2Face microservices for real-time animation of its virtual celebrity, Imma. Perfect World Games is integrating ACE into its mythological wilderness tech demo, Legends, enabling players to interact with multilingual AI NPCs.

Inventec is using ACE to enhance its healthcare AI agent within the VRSTATE platform, providing a more engaging virtual consultation experience. ServiceNow showcased ACE NIM in a generative AI service agent demo for its Now Assist Gen AI Experience, highlighting its potential to improve customer and employee interactions.

Innovations at COMPUTEX 2024

NVIDIA art teams also utilized generative AI tools built on ACE to produce a “digital Jensen” avatar for COMPUTEX 2024. This multilingual avatar featured Huang’s distinctive voice and style, generated using ElevenLabs’ AI speech and voice technology in Mandarin Chinese and English.

The ACE NIM microservices, including Riva and Audio2Face, are now in production; developers can add NVIDIA AI Enterprise software to receive enterprise-class support. Early access to ACE NIM microservices running on RTX AI PCs is also available.


