NVIDIA ACE Brings Lifelike Digital Humans to Developers


NVIDIA has announced that its suite of technologies, collectively known as NVIDIA ACE, is now generally available for developers. The suite is designed to bring digital humans to life using generative AI, promising advances in areas such as gaming, customer service, and healthcare.

ACE Now Available for Production Deployment

According to the NVIDIA Technical Blog, ACE is packaged as NVIDIA NIMs (Neural Inference Microservices), which enable high-quality natural language understanding, speech synthesis, and facial animation. Leading companies, including Aww Inc, Dell Technologies, Gumption, Hippocratic AI, Inventec, OurPalm, Perfect World Games, Reallusion, ServiceNow, SoulBotix, SoulShell, and UneeQ, are integrating ACE into their platforms.

NVIDIA has also introduced ACE PC NIM microservices, available through early access, for deployment across the installed base of 100 million RTX AI PCs and laptops.

Components and Updates

NVIDIA ACE 24.06 brings general availability to various components of the digital human technologies suite, including NVIDIA Riva, NVIDIA Audio2Face, and NVIDIA Omniverse RTX, all available through NVIDIA AI Enterprise.

Key microservices available in the NVIDIA NGC Catalog and the NVIDIA ACE GitHub repository include:


  • Riva ASR 2.15.1: Adds a new English model with improved quality and accuracy.
  • Riva TTS 2.15.1: Enhances representation for multiple languages and includes the beta release of P-Flow for voice adaptation.
  • Riva NMT 2.15.1: Introduces a new 1.5B any-to-any translation model.
  • Audio2Face 1.011: Adds blendshape customization options and improved lip sync for MetaHuman characters.
  • Omniverse Renderer Microservice 1.0.0: Adds a new animation data protocol and endpoints.
  • Animation Graph Microservice 1.0.0: Supports avatar position and facial expression animations.
  • ACE Agent 4.0.0: Adds speech support for custom RAGs and prebuilt support for RAG workflows.
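As a rough illustration of how a speech-synthesis microservice such as Riva TTS might be driven over HTTP, the sketch below assembles a JSON request payload. The field names and default values here are illustrative placeholders chosen for this example, not the actual Riva API schema; consult NVIDIA's documentation for the real request format.

```python
import json

def build_tts_request(text, voice="English-US.Female-1", sample_rate_hz=44100):
    """Assemble a JSON payload for a hypothetical speech-synthesis endpoint.

    Every field name below is a placeholder for illustration, not the
    actual Riva TTS API schema.
    """
    return json.dumps({
        "text": text,
        "voice": voice,
        "sample_rate_hz": sample_rate_hz,
        "encoding": "LINEAR_PCM",
    })

payload = build_tts_request("Hello from a digital human.")
print(payload)
```

In a real deployment this payload would be POSTed to the microservice, which would stream back synthesized audio for the animation pipeline to consume.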

Early-access microservices include:


  • Nemotron-3 4.5B SLM 0.1.0: Designed for on-device inference with minimal VRAM usage.
  • Speech Live Portrait 0.1.0: Animates a person’s portrait photo using audio.
  • VoiceFont 1.1.1: Reduces latency for real-time use cases and supports concurrent batches across GPUs.

Developer Tools and Workflows

To ease integration and deployment of ACE technologies, NVIDIA has released new workflows and developer tools on the NVIDIA ACE GitHub. The Kairos gaming reference workflow features an Audio2Face plugin for Unreal Engine 5, and the NVIDIA Tokkio customer service reference workflow includes various tools and samples for digital human configurations.

Additional developer tools include:


  • Unified Cloud Services Tools 2.5: Streamlines deployment of NVIDIA Cloud Functions applications.
  • Avatar Configurator 1.0.0: Adds a new base avatar, along with hairstyle and clothing options.

ACE NIM Microservices for RTX AI PCs

Beyond data center deployment, NVIDIA is bringing ACE NIM microservices to the installed base of 100 million RTX AI PCs and laptops. The first small language model, NVIDIA Nemotron-3 4.5B, is designed for on-device inference with accuracy comparable to large language models running in the cloud. Early access to the Nemotron-3 4.5B SLM is now available, with Audio2Face and NVIDIA Riva ASR on-device models to follow soon.

The Covert Protocol tech demo, developed in collaboration with Inworld AI, showcases Audio2Face and Riva ASR running locally on GeForce RTX PCs.

Getting Started

Developers can begin working with NVIDIA ACE by evaluating the latest ACE NIMs directly in a browser or through API endpoints running on a fully accelerated stack. NVIDIA offers tools and workflows to accelerate integrations, along with early access to microservices, to help developers see how ACE can transform their pipelines.
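Evaluating a hosted NIM through an API endpoint typically amounts to an authenticated JSON POST. The sketch below builds such a request with the standard library; the endpoint URL, API key, and body fields are placeholders invented for this example (substitute the values from your own NVIDIA account), and the request is deliberately never sent so the sketch runs without network access.

```python
import json
import urllib.request

# Hypothetical endpoint and API key; replace with real values from your
# NVIDIA API catalog account before sending anything.
ENDPOINT = "https://api.example.com/v1/ace/chat"
API_KEY = "YOUR_API_KEY"

body = json.dumps({
    "messages": [{"role": "user", "content": "Greet a visitor."}],
    "max_tokens": 64,
}).encode("utf-8")

request = urllib.request.Request(
    ENDPOINT,
    data=body,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(request) would actually send it; omitted here
# so the example stays offline.
print(request.full_url, request.get_method())
```

The same pattern applies whether the endpoint is NVIDIA-hosted or a NIM container running in your own infrastructure; only the URL and credentials change.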

For enterprises seeking an end-to-end digital human solution or custom development on ACE, the NVIDIA Partner Network includes service delivery partners such as Convai, Inworld AI, Data Monsters, Quantiphi, SoulShell, Top Health Tech, and UneeQ.

For questions or feedback about digital human technologies, developers can visit the Digital Human forum.



Image source: Shutterstock
