NVIDIA Demonstrates Real-Time Generative AI in 3D Desert World Creation


Ted Hisokawa
Aug 01, 2024 05:33

NVIDIA researchers showcased real-time generative AI capabilities using NVIDIA Edify at SIGGRAPH, highlighting rapid 3D world-building.

NVIDIA researchers have demonstrated the impressive capabilities of real-time generative AI in creating immersive 3D environments. During a live demo at SIGGRAPH 2024’s Real-Time Live event, they showcased how NVIDIA Edify, a multimodal architecture for visual generative AI, can rapidly build detailed 3D landscapes.

Accelerating 3D World-Building

The demonstration, held during one of the premier sessions of the prestigious graphics conference, highlighted how an AI agent powered by NVIDIA Edify could construct and edit a desert landscape from scratch in just five minutes. The technology acts as an assistant to artists, significantly reducing the time required for ideation and for generating the custom secondary assets that would otherwise be sourced from repositories.

By drastically decreasing ideation time, these AI technologies empower 3D artists to be more productive and creative. Artists can generate background assets or 360 HDRi environments in minutes, rather than spending hours finding or creating them.

From Concept to 3D Scene

Creating a full 3D scene is typically a complex and time-consuming task. However, with the support of AI agents, creative teams can quickly bring concepts to life and continue iterating to achieve the desired look. In the Real-Time Live demo, researchers used an AI agent to instruct an NVIDIA Edify-powered model to generate dozens of 3D assets, including cacti, rocks, and a bull’s skull, with previews produced in seconds.

The AI agent then used other models to create potential backgrounds and layouts for object placement within the scene. The demonstration showcased the agent’s adaptability to last-minute creative changes, such as quickly swapping rocks for gold nuggets. Once the design plan was in place, the AI agent generated full-quality assets and rendered the scene as a photorealistic image using NVIDIA Omniverse USD Composer.

NVIDIA Edify’s Capabilities

NVIDIA Edify models assist creators in focusing on hero assets while accelerating the creation of background environments and objects using AI-powered scene generation tools. The Real-Time Live demo featured two Edify models:


  • Edify 3D: Generates ready-to-edit 3D meshes from text or image prompts, producing previews, including rotating animations, in seconds to help creators rapidly prototype (a rough usage sketch follows this list).

  • Edify 360 HDRi: Uses text or image prompts to generate up to 16K high-dynamic-range images (HDRi) of nature landscapes for use as backgrounds and scene lighting.
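
To make the prompt-in, asset-out flow above concrete, here is a rough Python sketch of how a client could request a quick 3D preview from a generative 3D service. The endpoint, parameter names, and output format are hypothetical placeholders, not the actual Edify or Shutterstock API; they only mirror the workflow described in this article.

```python
import requests

# Hypothetical endpoint and schema: placeholders standing in for whatever
# text-to-3D service a team actually uses.
GENERATE_URL = "https://example.com/api/v1/generate-3d"  # placeholder URL
API_KEY = "YOUR_API_KEY"                                 # placeholder credential

def generate_3d_preview(prompt: str) -> bytes:
    """Request a quick 3D mesh preview for a text prompt and return the binary glTF bytes."""
    response = requests.post(
        GENERATE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "prompt": prompt,      # e.g. "weathered bull skull, sun-bleached"
            "quality": "preview",  # hypothetical flag: fast draft vs. full-quality asset
            "format": "glb",       # hypothetical flag: binary glTF output
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.content

if __name__ == "__main__":
    glb_bytes = generate_3d_preview("small barrel cactus, stylized, desert palette")
    with open("cactus_preview.glb", "wb") as f:
        f.write(glb_bytes)
```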

The demo also showcased an AI agent powered by a large language model and USD Layout, an AI model that generates scene layouts using OpenUSD, a platform for 3D workflows.
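
For readers unfamiliar with OpenUSD, the following minimal Python sketch shows the kind of layout such a model might emit: a stage whose prims reference existing asset files and carry placement transforms. It assumes the open-source pxr Python bindings; the asset paths and prim names are illustrative, not output from USD Layout itself.

```python
from pxr import Usd, UsdGeom, Gf

# Minimal sketch: author a simple OpenUSD layout by hand, the kind of structure
# a layout-generating model could emit. Asset paths and prim names are illustrative.
stage = Usd.Stage.CreateNew("desert_layout.usda")
world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# Each scene object is an Xform prim that references an asset file
# and carries its own placement transform.
placements = {
    "/World/Cactus_01": ("assets/cactus.usd", Gf.Vec3d(-2.0, 0.0, 1.5)),
    "/World/Rock_01": ("assets/rock.usd", Gf.Vec3d(3.0, 0.0, -0.5)),
    "/World/Skull_01": ("assets/bull_skull.usd", Gf.Vec3d(0.5, 0.0, 0.0)),
}
for path, (asset_file, translation) in placements.items():
    xform = UsdGeom.Xform.Define(stage, path)
    xform.GetPrim().GetReferences().AddReference(asset_file)
    xform.AddTranslateOp().Set(translation)

stage.GetRootLayer().Save()
```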

Industry Adoption

At SIGGRAPH, NVIDIA announced that leading creative content companies are utilizing NVIDIA Edify-powered tools to enhance productivity with generative AI. Shutterstock has launched a commercial beta of its Generative 3D service, enabling creators to quickly prototype and generate 3D assets using text or image prompts. The service’s 360 HDRi generator, based on Edify, has also entered early access.

Getty Images has updated its Generative AI by Getty Images service with the latest version of NVIDIA Edify. Users can now create images twice as fast, with improved output quality and prompt adherence, along with advanced controls and fine-tuning.

Compatibility with NVIDIA Omniverse

The 3D objects, environment maps, and layouts generated using Edify models are structured with USD, a standard format for describing and composing 3D worlds. This compatibility allows artists to import Edify-powered creations into Omniverse USD Composer seamlessly. Within Composer, artists can further modify scenes by changing object positions or appearances, or by adjusting lighting.
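
As a minimal sketch of those kinds of edits, the snippet below opens a USD layout with the open-source pxr Python bindings, repositions one referenced asset, and retunes a dome light driven by a 360 HDRi. Paths, prim names, and values are illustrative; inside Omniverse USD Composer the same changes would typically be made interactively.

```python
from pxr import Usd, UsdGeom, UsdLux, Gf

# Minimal sketch: open a USD layout and make the kinds of edits described above,
# nudging an object and adjusting lighting. Paths and values are illustrative.
stage = Usd.Stage.Open("desert_layout.usda")

# Move one asset to a new position.
rock = UsdGeom.Xformable(stage.GetPrimAtPath("/World/Rock_01"))
rock.ClearXformOpOrder()  # drop the old transform ops
rock.AddTranslateOp().Set(Gf.Vec3d(4.0, 0.0, 2.0))

# Add (or retune) a dome light that uses a 360 HDRi as environment lighting.
dome = UsdLux.DomeLight.Define(stage, "/World/EnvLight")
dome.CreateTextureFileAttr("assets/desert_sunset_360.hdr")
dome.CreateIntensityAttr(1.5)

stage.GetRootLayer().Save()
```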

Real-Time Live is one of SIGGRAPH’s most anticipated events, featuring real-time applications, including generative AI, virtual reality, and live performance capture technology.

For more details, visit the NVIDIA Blog.

Image source: Shutterstock
