State of Unreal | GDC 2018 | Unreal Engine

♪ ♪ [APPLAUSE]>>Tim Sweeney: Well,
thank you all for coming. Now is a really exciting time, both for gamers
and for game developers. Right now we are 10 years into
the mobile gaming revolution. This is a revolution
that has brought more than 3.5 billion new computing
device owners into the market, you know, in the form
of smartphones and tablets, and billions of new gamers.
It is one of the greatest events in the history
of the game industry. It has had
this side effect that has been very noticeable
for the past decade, and that is that
the game industry has been split
really into two halves. On one side there are
the high-end PC and console gamers
and game developers, and this is Epic’s heritage,
you know, from Unreal Tournament and Gears of War,
high-end games for core gamers. On the other hand, there have
been casual mobile games, starting with amazing games
like Angry Birds, and evolving on from that. But always, the game
industry has been split. There has also been a growing
degree of dysfunction in mobile gaming, which is that there are
literally over 100,000 games shipping every year
on these mobile devices, all competing for the same top
10 charts in the app store. That makes it really hard
for developers. A lot have been forced to resort
to ad-driven games, and sketchy monetization models.
We see this new trend now that is up-ending
the game industry. We think it is going to
be a great thing for everybody, and that
is that mobile gaming is now shifting
from casual gaming as a focus to serious
games for gamers. It is inherently
a flight to quality. This began in Korea. Last year, we told you about
Lineage II Revolution. Now this is an open world
MMO with over 100 players on screen,
powered by Unreal Engine 4, that runs on iOS
and Android, and brings a truly
console-quality game experience to owners of all
of these mobile devices. That is just the start;
but now it is coming here. Rocket League, the hit PC and console
game, is now coming to Switch. ARK: Survival Evolved — an amazing open world
dinosaur survival game; huge environments
and beautiful graphics. It is coming to iOS
and Android, and you will hear more about
that later in this presentation. PUBG has come to mobile. It is a completely
faithful version of the PC
and console game, now on iOS and Android
from Bluehole and PUBG Corp. Epic has brought
Fortnite to iOS now, and it is coming
soon to Android. The really neat thing
about Fortnite is, it is the full game.
It is the entire Fortnite experience
in Fortnite Battle Royale. It is fully interoperable
between iOS, PlayStation, Xbox, PC and Mac. Gone are the days of ports
that eventually ship; we are in a new era of games that support
all platforms, and fully support
gamers' aspirations across their multiple devices. This is changing
the game industry fast. This is the iOS top
10 chart from yesterday; the top two games
are now hardcore games for serious gamers
powered by Unreal. As a result of
this transformation, Unreal adoption is growing
incredibly quickly now. Last year we announced
that three million developers had chosen
the Unreal Engine. Now more than five million
have installed Unreal on their machines. That is more than 10,000 new
developers coming in every day. That is three times the rate
of adoption of one year ago. We are really seeing
a transformation. These developers
are all sorts of people; they are indie devs,
they are students, they are pro game developers, and they are professionals
across all other industries using real-time
computer graphics and are now using
game engines for their work. So right now there is full
employment among Unreal developers.
In fact, for the first time now, we have a dedicated booth
on the GDC show floor just for showing games,
where more than 20 teams have demos set up,
along with the Fortnite team, there to show the game and recruit
developers for the project – and from 1:00 p.m. onward,
free beer will be provided. [APPLAUSE] So please stop by
and join the party. It is an awesome
time in Unreal land. Powering this growth is
a really simple business model, and that is any
developer can download Unreal and start building
a game of any scale for free. Once you release your game, we ask for a royalty
of 5 percent of gross revenue. Now this is a business model
in which Epic only succeeds
when you succeed as a developer. It means our interests are completely aligned
in supporting you. As both an engine developer
and a game developer ourselves, you know, we bleed with you. We use our game building
experience to drive future engine improvements
and optimizations, and the continual process
of incremental improvement brings great
new features to everybody. For the last few months, a large
part of the Unreal Engine team has been helping
the Fortnite team ship on mobile devices, run at 60 frames
a second on console, and achieve other really
impressive technical goals. So I would like to invite
Nick Penwarden, Unreal Engine development lead, onstage to talk about some of
the things we have done for Fortnite that are coming to
all Unreal devs. Thanks. [APPLAUSE]>>Nick Penwarden: Thanks, Tim. The engine team has had to solve
many technical challenges in shipping
Fortnite Battle Royale, especially shipping
the exact same game on PC, console and mobile devices. The Battle Royale map
is an open world that is over
six square kilometers, and consists of more
than 400 streaming levels; with cities
and other points of interest that are visible from more
than two kilometers away. While skydiving, players move
through the world quickly, and so we had to
make sure the engine could stream in
content hitch-free at 60 frames per second
before the player lands. We have moved more streaming
and serialization work off of the main thread, and tuned async loading
to saturate I/O bandwidth in order to make this possible.
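The exact streaming changes live inside the engine, but the basic pattern being described can be sketched in a few lines: heavy reads and deserialization happen on worker threads, and the game thread only finishes up small pieces of work under a strict per-frame time budget, so nothing ever hitches. The names below are illustrative, not UE4 API.

```cpp
// Illustrative only (hypothetical names, not UE4 API): finish async load
// work on the game thread under a strict per-frame time budget, so streaming
// a large open world never causes a visible hitch while players skydive.
#include <chrono>
#include <deque>
#include <functional>

using FinalizeStep = std::function<void()>; // e.g. registering one streamed-in level's actors

class HitchFreeStreamer {
public:
    void QueueFinalizeStep(FinalizeStep Step) { PendingSteps.push_back(std::move(Step)); }

    // Called once per frame on the game thread.
    void Tick(double BudgetMs = 1.0) {
        using Clock = std::chrono::steady_clock;
        const auto Start = Clock::now();
        while (!PendingSteps.empty()) {
            PendingSteps.front()();      // do one small piece of finalization work
            PendingSteps.pop_front();

            const double ElapsedMs =
                std::chrono::duration<double, std::milli>(Clock::now() - Start).count();
            if (ElapsedMs >= BudgetMs)   // out of budget: resume next frame instead of hitching
                break;
        }
    }

private:
    // Raw file I/O and deserialization would run on worker threads (not shown);
    // only the cheap, thread-unsafe finalization steps land in this queue.
    std::deque<FinalizeStep> PendingSteps;
};
```

With a budget like one millisecond, a frame that has too much pending work simply carries the remainder over to the next frame instead of stalling.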
Simulating a Battle Royale session with 100 players and more than 50,000 replicated Actors puts a lot of strain
on our dedicated servers. Networking updates are
our biggest bottleneck. Every object that players
interact with, such as walls and trees as well
as player-built structures, needs to send updates
to all nearby players. We have been
optimizing replication to more efficiently figure out
which of those 50,000 Actors need to be replicated
to each player, as well as intelligently caching
and sharing that data among players,
whenever possible.
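The engine's actual replication code is far more involved, but the two ideas mentioned here, relevancy filtering and shared serialization, can be sketched independently. The types below are invented for illustration and are not UE4 classes.

```cpp
// Conceptual sketch (not UE4 code): cull replication by relevancy, then share
// one serialized payload per actor across every connection that needs it.
#include <cstdint>
#include <unordered_map>
#include <vector>

struct Vec3 { float X, Y, Z; };

struct ReplicatedActor {
    uint32_t Id;
    Vec3 Position;
    bool bDirtyThisTick;       // state changed since the last send
    float RelevancyRadius;     // how far away players still care about it
};

struct Connection {
    Vec3 ViewLocation;
    std::vector<std::vector<uint8_t>> OutgoingPayloads;
};

static float DistSq(const Vec3& A, const Vec3& B) {
    const float dx = A.X - B.X, dy = A.Y - B.Y, dz = A.Z - B.Z;
    return dx * dx + dy * dy + dz * dz;
}

// Hypothetical: produce the network representation of an actor's changed state.
static std::vector<uint8_t> SerializeActorState(const ReplicatedActor&) { return {}; }

void ReplicateTick(std::vector<ReplicatedActor>& Actors, std::vector<Connection>& Connections) {
    // Serialize each dirty actor at most once per tick, no matter how many
    // players can see it; connections share the cached bytes.
    std::unordered_map<uint32_t, std::vector<uint8_t>> SharedPayloadCache;

    for (ReplicatedActor& Actor : Actors) {
        if (!Actor.bDirtyThisTick) continue;   // nothing changed: skip the actor entirely

        for (Connection& Conn : Connections) {
            const float R = Actor.RelevancyRadius;
            if (DistSq(Actor.Position, Conn.ViewLocation) > R * R) continue; // not relevant to this player

            auto It = SharedPayloadCache.find(Actor.Id);
            if (It == SharedPayloadCache.end()) {
                It = SharedPayloadCache.emplace(Actor.Id, SerializeActorState(Actor)).first;
            }
            Conn.OutgoingPayloads.push_back(It->second); // reuse, do not re-serialize
        }
        Actor.bDirtyThisTick = false;
    }
}
```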
Being a PvP game, smooth frame rate and responsive input are crucial
to the experience, so we took up the challenge
to develop a 60 frames per second mode
for Battle Royale on consoles. On the rendering
side, we have added support for dynamic resolution, allowing us to set a GPU budget and maintain
a stable frame rate, while varying quality
as the action heats up. This is built upon
temporal up-sampling, which allows us to change
the resolution frequently, as often as every eight frames, without the change
in resolution being distracting. That way we can respond quickly
to increases in GPU workload.
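A stripped-down version of that control loop might look like the following; the numbers and the struct are placeholders rather than the engine's actual dynamic resolution code.

```cpp
// Simplified sketch (not engine code) of a dynamic-resolution controller:
// measure GPU frame time against a fixed budget and nudge the render
// resolution up or down, only re-deciding every few frames so temporal
// upsampling can hide the change.
#include <algorithm>
#include <cmath>

struct DynamicResolutionController {
    float GpuBudgetMs      = 16.0f;  // target: roughly 60 fps worth of GPU time
    float ScreenPercentage = 100.0f; // current render resolution as % of output
    float MinPercentage    = 60.0f;
    float MaxPercentage    = 100.0f;
    int   FramesBetweenChanges = 8;  // matches the "every eight frames" cadence described above
    int   FramesSinceChange    = 0;

    // Call once per frame with the measured GPU time of the previous frame.
    void Update(float MeasuredGpuMs) {
        if (++FramesSinceChange < FramesBetweenChanges) return;
        FramesSinceChange = 0;

        // Scale resolution by the ratio of budget to measured cost. Pixel cost is
        // roughly quadratic in screen percentage, so take a square root.
        const float Ratio = GpuBudgetMs / MeasuredGpuMs;
        float NewPercentage = ScreenPercentage * std::sqrt(Ratio);

        NewPercentage = std::clamp(NewPercentage, MinPercentage, MaxPercentage);
        ScreenPercentage = NewPercentage;
    }
};
```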
On the game thread side of things, we have, of course, continued optimizing the engine
and moved more computation onto worker threads
wherever we can. We also approach 60 frames per
second as a scalability problem, adding more knobs
to control level of detail, like the number and quality
of animation updates, physics and effects, allowing us to hit 60
on the base Xbox One and PlayStation 4, as well as Xbox One X
and PlayStation 4 Pro. Speaking of scalability,
we leveraged all of that scalability
and optimization work to bring the full Battle Royale
experience to mobile devices. It is the same app, the
same number of players, and in fact, we support
cross-play so mobile players can be in the same game as
their friends on PC and console. We tuned Level of Detail
for Meshes, Animations, Sound and Foliage
to fit the game in memory and run at framerate. On mobile devices,
the number of draw calls we can issue each frame
was a major limiting factor. We developed some new
Mesh merging and instancing tools to help get
the number of draw calls down, while still making sure
that we render every Mesh that affects gameplay.
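The real merging and instancing tools do much more, but the core bookkeeping is simple to sketch: group placed meshes by their mesh and material pair and emit one instanced draw per group. Everything below is illustrative, not the engine's implementation.

```cpp
// Conceptual sketch (not the engine's tools): batch identical mesh/material
// pairs into instanced draws so thousands of placed objects cost a handful
// of draw calls instead of one each.
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

struct Matrix4x4 { float M[16]; };

struct PlacedMesh {
    uint32_t  MeshId;     // which mesh asset
    uint32_t  MaterialId; // which material it is rendered with
    Matrix4x4 Transform;  // where it sits in the world
};

struct InstancedDraw {
    uint32_t MeshId;
    uint32_t MaterialId;
    std::vector<Matrix4x4> InstanceTransforms; // one GPU instance per placement
};

std::vector<InstancedDraw> BuildInstancedDraws(const std::vector<PlacedMesh>& Placements) {
    // Group every placement that shares a (mesh, material) pair.
    std::map<std::pair<uint32_t, uint32_t>, InstancedDraw> Batches;
    for (const PlacedMesh& P : Placements) {
        InstancedDraw& Batch = Batches[{P.MeshId, P.MaterialId}];
        Batch.MeshId = P.MeshId;
        Batch.MaterialId = P.MaterialId;
        Batch.InstanceTransforms.push_back(P.Transform);
    }

    std::vector<InstancedDraw> Draws;
    Draws.reserve(Batches.size());
    for (auto& KV : Batches) Draws.push_back(std::move(KV.second));
    // Draw call count equals the number of unique mesh/material pairs.
    // Nothing here culls geometry, so gameplay-affecting meshes are never dropped.
    return Draws;
}
```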
On Android, we have even added support for some of our parallel
rendering features, so that we have an entire core
dedicated just to submitting OpenGL
commands.
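Dedicating a core to OpenGL submission is essentially a producer and consumer split: other threads record commands, and a single thread that owns the GL context replays them into the driver. A bare-bones sketch of that queue, with a generic std::function standing in for real render commands:

```cpp
// Bare-bones sketch (not the engine's RHI): one thread produces abstract
// render commands, another thread owns the GL context and is the only one
// that translates them into driver calls.
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

using RenderCommand = std::function<void()>; // e.g. a captured glDraw* call

class SubmissionThread {
public:
    SubmissionThread() { Consumer = std::thread([this] { Run(); }); }
    ~SubmissionThread() {
        { std::lock_guard<std::mutex> L(M); bStop = true; }
        CV.notify_one();
        Consumer.join();
    }

    // Producer side (render thread): enqueue and keep going, never block on the driver.
    void Enqueue(RenderCommand Cmd) {
        { std::lock_guard<std::mutex> L(M); Commands.push(std::move(Cmd)); }
        CV.notify_one();
    }

private:
    void Run() {
        // This thread would make the GL context current once, then spend the whole
        // frame draining commands, keeping driver overhead off the other cores.
        for (;;) {
            RenderCommand Cmd;
            {
                std::unique_lock<std::mutex> L(M);
                CV.wait(L, [this] { return bStop || !Commands.empty(); });
                if (bStop && Commands.empty()) return;
                Cmd = std::move(Commands.front());
                Commands.pop();
            }
            Cmd(); // the only place GL functions would actually be called
        }
    }

    std::mutex M;
    std::condition_variable CV;
    std::queue<RenderCommand> Commands;
    bool bStop = false;
    std::thread Consumer;
};
```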
All of the features, optimizations and tools I have mentioned will be
available on Perforce and GitHub next month, and will ship
with the 4.20 release this June. I really cannot wait to see
what you all do with these changes
that we have made. I would like to invite everybody
to come back later today to our Tech Talks on Optimizing Unreal Engine
for Battle Royale. Thank you. [APPLAUSE]>>Tim Sweeney: Thanks, Nick. Also, on Monday we announced
that we were releasing a huge block
of content from Paragon for free to all Unreal
Engine developers. Now this package includes the
large majority of the characters from the game
and the environmental assets, which developers are now free
to use in any game powered by Unreal Engine of any
type. So you can build a MOBA or a brawler
or a shooter or a machinima and get the full benefit
of this content. You know, for three years we poured our hearts
and souls at Epic into building a MOBA
with photorealistic graphics that put players directly
into the action. The team
built a really great game, but we were never able to
attract a large enough audience to really operate the project
as a triple A success. So when we closed Paragon, the team immediately
went into action to make this content available
to everybody. This is just another way
that at Epic, we are really trying to do
absolutely everything we can to help
other developers succeed in the way that we have
succeeded ourselves with Epic. [APPLAUSE] Another really interesting thing
we talk about every year is the growth of the Unreal
Engine outside of gaming. This has been really interesting
to see architects, industrial designers
and filmmakers all adopt the Unreal Engine for
their real-time graphics needs. These are industries
that have used non-real-time
graphics for decades, and now they find they can just
load their scenes in Unreal and it all works, and it is
completely dynamic in real-time. This has pushed us in ways that have really benefited
game development. You know, when a game developer
wants a carbon fiber material but we cannot quite render it, they will just choose something
else, and that will be fine. But when Ferrari
needs carbon fiber, you give Ferrari
carbon fiber, right? So this has delivered
a large number of really awesome new features that benefit everybody
across industries, including game developers. So to talk about some of
the really interesting things happening in film
and television, I would like to invite
Kim Libreri, Epic Games CTO,
to talk more. [APPLAUSE]>>Kim Libreri: Thanks, Tim.
Hi, everybody. Pretty good crowd. So this year, we have been super
blown away with how many films, television shows
and animation companies around the world
are using Unreal Engine to make all sorts of amazing
content and entertainment. It is not only to create
cutting edge visuals, but to also liberate
creators to design, build and experience
stories in new ways; stories that are not
only limited to passive viewing, but also can invite fans to enter the world
of their favorite IPs. Here are some examples. Our friends at The Third Floor
are a frequent contributor to visualization
and virtual production of some of today's
biggest movies. They recently used Unreal Engine
to interactively visualize the most recent season
of HBO’s Game of Thrones. Real-time pre-visualization enables users
to rapidly turn ideas around and get them across,
while also looking great. Some of the cool capabilities
that The Third Floor have built
on top of Unreal Engine are demonstrated in this scene,
with new ways to scout sets, plan practical shoots
and frame the action virtually. This interactive sandbox
is truly transforming the way content can be visualized,
developed, and viewed. Speaking of virtual production, we are thrilled to announce
that Glenn Derry, here on the right,
and his team at Fox VFX Lab — he has got fans — Fox VFX Lab have chosen
Unreal Engine as the real-time platform
to power their vision for the future of filmmaking. Glenn is a superstar
in the movie world, and his team was the team that
pioneered virtual production for James Cameron’s Avatar. Glenn aims to blur the line
between what is real and what is virtual
on a movie set, and will offer filmmakers
a suite of capabilities that will really redefine
how a movie is made. In the world of TV animation,
Digital Dimension up in Montreal are producing a kids TV show called Zafari
entirely in UE4. They have seen
some huge gains not only in cost savings
and productivity, but also in the quality
of their final product. Working in real-time allows them
to experiment and iterate without the usual penalties of
traditional animation pipelines. As we know from making
games, iteration is king. And the more times
you can iterate, the better the final product
will be. The real game-changer
is that for this TV show, they can make one episode
a week. It is amazing. It is just unheard of
in the animation business. It is not just big companies
who are achieving success. Third World Studios, a small
animation company in Pakistan, created the
critically acclaimed, CG animated feature film, Allahyar and the Legend
of Markhor. A small team of only 60 artists,
animators and engineers made a full-length feature film created entirely
in Unreal Engine by using assets from the Unreal Engine
marketplace. They were able
to save valuable time and focus their energies
on the things that were really important
for their story. And with mobile GPUs evolving
as quickly as we are seeing today, it is only
going to be a matter of time before something as great
as this can be played interactively
on your phone or tablet. Imagine the possibilities,
being able to take content, modify it, share it, and pass
it on to your next friend. Amazing. Speaking of movies, this year, the Academy awarded
a special achievement Oscar to Alejandro Gonzalez Inarritu
for his mesmerizing and unforgettable
VR experience, Carne y Arena. The last time one of these
awards was given out was in 1996 for Pixar’s pioneering work
in computer graphics for the original Toy Story.
It really speaks volumes that the Academy considers
interactive VR experiences as a new form
of cinematic storytelling. It is inspiring what Inarritu
and the team at Legendary Entertainment and
ILMxLAB have been able to make. These are
all our friends, I have known them for many
years, and they have killed it. An Oscar for a VR experience —
it is an amazing thing. Now let us talk about
something important across all interactive
media types. Digital Humans.
Two years ago we were here on the stage with our friends
from Ninja Theory to show you a live version
of Senua from the game Hellblade interactively
being performed by Mel, the amazing actress
that plays Senua. It was a landmark presentation
that defined the possibility of a live-performed,
real-time virtual character. Since then, Hellblade
has been released, and it has received
overwhelming success and amazing critical acclaim. It just goes to show
that a small team can produce a thought-provoking,
successful triple A quality game that not
only hugely entertains gamers, but also can carry
an important message. We are very proud of them. For this year, when it comes
to digital humans, we felt we could
push things even farther. We still thought
that there was work we could do improving
the quality of shading, especially on the skin
and the eyes; the lighting and also the quality
of the live performance capture
and facial animation. So we brought together
our friends at Tencent, 3Lateral, CubicMotion, Vicon and our
Epic Special Projects team to form a group
that would aim to take live capture digital humans
to the next level. So please let me
introduce you to Siren.>>Hello. I am Siren,
and I am a digital human. I was created
by an international team of artists and engineers, who wanted to challenge
our ideas of what a synthetic
human could be. I have got state of the art,
real-time graphics, and an unprecedented level
of detail in my eyes, skin and hair.
Cool, right? But I am more than just
a collection of fancy pixels. I am actually being driven
by a real human actress and her dynamic motion
capture through Unreal Engine. So what are you waiting for? Come meet me at the Vicon booth
and see for yourself.>>Kim: Cool.
[APPLAUSE] Thank you. Everything you just saw
was rendered real-time in Unreal Engine
at 60 frames a second. CubicMotion and 3Lateral
made crazy, significant improvements to the
live performance capture and facial
rigging technology, compared to what we were
able to do on Senua. We also added a bunch of new
features to the engine that anybody trying
to do digital characters for a game or experience
will find useful. We have Dual Lobe
Specular reflection, so the skin looks
more realistic. We have backscatter so light
will pass through ears and noses to give you
that light radiance and beauty that you see
in real photography. We also have Screen
Space Irradiance that allows light
to bounce off cheekbones and into eye sockets to really help
a character's eyes light up. We also added the ability
to stream animation data via Live Link, directly from
Vicon's motion capture systems. And as Siren herself said, she is going to be performing
live every day at GDC. So if you go over to the Vicon booth, you will be
able to meet her, film her interactively,
talk to her, ask her questions; and Alexa is going to be there
to drive the character Siren. So from a rendering perspective, we feel we are getting
pretty close to crossing the uncanny valley
in real-time. Well, the devil
is in the details. You really need
a facial animation rig that can recreate all the
subtle nuances of a human face. This is slightly different
for every single person that you would want to digitize.
It is very easy to work out when something is
in the Uncanny Valley; it looks
weird and uncanny. It is actually
very hard to spot exactly what
the root cause is. The number of times in dailies
when we look at a digital human and start arguing about
what it is: is it the eyes, is it the flesh, is it
the animation speed — all sorts of crazy things
get in the way. So to really understand how to
take things to the next level, you need a digitization system,
a capture system that can get every single detail
of an actor's performance. So to explain more, let us
invite Vladimir Mastilovic from 3Lateral, our
good friend, to introduce you to some
new developments they have made in
digital face capture and animation technology.>>Vladimir Mastilovic: Thanks, Kim.
>>Kim: Here you go. [APPLAUSE]>>Vladimir: Thank you. So 3Lateral specializes
in what the world came to call
“digital life,” with a very specific
focus on digital humans. In that domain, we developed
technologies from acquisition to compression
and articulation of the data. We have just finished
new technology, which looks really great
in Unreal Engine. So great capture technology requires great acting
and great performance; so we are very excited
and very grateful that we have been able to work
on this project with famous actor,
Andy Serkis. Andy's acting has been driving
some of the world's most famous characters
in the past few decades. To mention only a few, there is
Gollum from Lord of the Rings, Caesar in the Planet
of the Apes, and recently,
Snoke in Star Wars. Here is Andy to give you
an introduction to this project.>>Hi there.
Andy Serkis here, and what you are going to see
in the demo today is a re-creation of
my performance in real-time, running in Unreal Engine. Thanks to 3Lateral's
latest generation of 4D performance technology, you will see a performance
that is captured at the highest fidelity
possible today. Here is Vlad to explain more.>>Vlad: So we are about to see
a little glimpse into the future, one of the first samples
of true holographic data. We have captured Andy doing
a very well-known performance from Macbeth from Shakespeare. So let us take a look
at that in the Engine.>>Andy: And tomorrow — and tomorrow creeps
in this petty pace, from day to day, to the last
syllable of recorded time. And all our yesterdays
have lighted fools the way to dusty death. Out, brief candle! Life is but a walking shadow, a poor player that struts and
frets his hour upon the stage. And then he is heard no more. It is a tale told by an idiot, full of sound and fury
signifying nothing. [APPLAUSE]>>Vlad: Thank you. [APPLAUSE] Even though that looked
like a pre-rendered video, that was actually running
in real-time in the engine. Here is Colin from Epic Games.
our virtual cinematographer, and he will help us see this
from different views. So not only are we
capturing enormous amounts of data, but we are processing it through
3Lateral's meta human framework. And we are compressing it up
to a million times to an asset, pretty much the same as assets
we have been using in production in the past eight years. This means that we do
not need hardware from the future to run this; it can run on current
generation hardware. It is also important
to note that it is not a simple playback. The technology is animating a
rig with beautiful, organic and noise-free function curves
for estimated muscle contraction that it sees in the data. And these curves are also
human-readable and editable. So even though no one
has tweaked this animation — it was done
via algorithm — an animator can use this
as a starting point. Colin can now demonstrate that
we can actually add an offset to multiple contractions
that we see in the performance, so he is just going to
do something subtle with the brows, just to
demonstrate the capability, because we want to be sensitive
to Andy's performance as well. And we can make Andy look
to the other side — so you can see that it is kind
of a sensitive area, because of the high fidelity. What this allows is,
it opens up new possibilities for targeting the animation
to a new face, new character, or simple but fundamental
things in production, like integrating the character
in the actual scene. It can be used
for experience design, like having Andy Serkis
directly addressing each member of the audience
in the VR space, which I personally feel
is very powerful, and is what actors have wanted
to do for a long time; and it opens up
a whole new level of immersion
for experiences that are based
on performance presence. So seeing this process
at a more abstract level, the technology is translating
the data into universal, non-verbal communication. This has amazing potential
in research applications. To prove that we are solving for a
universal non-verbal language, let us see the same data, but this time
on a fictional character. We have designed a mystical
alien character named Osiris, and we are applying
the same data from Andy Serkis without any tweaks
to the animation. Let us have a look at that.>>Osiris: Out, out, brief
candle! Life is but a walking shadow, a poor player that struts and
frets his hour upon the stage. And then he is heard no more.>>Vlad: So implications
of this technology are many. We have digitized an appearance
and activing of Andy Serkis. So at this level of fidelity,
the term digital humans is starting to get
its full meaning. His performance
can now be reshot, remapped, or saved for future generations. Maybe for Andy himself,
if in 20 years, he would like to act
with the face that he has right now.
Thank you. [APPLAUSE]>>Kim: Amazing! First of all,
thanks to Andy Serkis for being such a good sport,
and letting us digitize him, and work with him
for the last five weeks. That whole demo, from the
beginning of Andy showing up at 3Lateral's place
to today was five weeks. It is a testimony to how
awesome Vlad's technology is. Hold on, let me fast-forward. Okay, so the little
virtual camera app that you saw Colin playing with is something we are going to
release at 4.20 with the engine, so anybody who is a
budding filmmaker will be able to download it,
connect it to their iPad. The actual system,
the way it works, it is not magic,
it is actually ARKit. ARKit allows us to track
where the iPad is and then send that information
down to a PC that then can render
the graphics, and then remote stream
it back into the iPad. It means you can have
super high-end graphics running on the best
NVIDIA cards or AMD cards and stream it to the tablet.
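Conceptually, the link is just a small pose packet per frame flowing from the tablet to the PC, and a video stream flowing back. A hypothetical PC-side receiver could look like this; the packet layout and names are invented for the sketch, and the device side would use ARKit's own APIs.

```cpp
// Hypothetical sketch of the PC side of a virtual-camera link: a small pose
// packet arrives from the tablet each frame and drives the in-engine camera.
// The packet layout and names here are invented for illustration.
#include <cstddef>
#include <cstdint>
#include <cstring>

struct PosePacket {              // what the tablet would send roughly every frame
    uint32_t FrameId;
    float    Position[3];        // tracked camera position, in meters
    float    Orientation[4];     // quaternion (x, y, z, w)
    float    FocalLengthMm;      // lens setting chosen on the touch screen
};

struct VirtualCamera {
    float Position[3];
    float Orientation[4];
    float FocalLengthMm;
};

// Called when a datagram arrives from the tablet; returns false if malformed.
bool ApplyPosePacket(const uint8_t* Data, size_t Size, VirtualCamera& Camera) {
    if (Size < sizeof(PosePacket)) return false;
    PosePacket Packet;
    std::memcpy(&Packet, Data, sizeof(Packet));

    std::memcpy(Camera.Position, Packet.Position, sizeof(Camera.Position));
    std::memcpy(Camera.Orientation, Packet.Orientation, sizeof(Camera.Orientation));
    Camera.FocalLengthMm = Packet.FocalLengthMm;
    // The engine then renders from this camera and streams the encoded frame
    // back to the tablet for display (encoding and streaming not shown).
    return true;
}
```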
All three digital characters that you just saw are going to be available in an
interactive demo at our booth, so you will be able to go in,
touch, zoom, look at the eyes,
look at all the details, and swap all the different
characters. If you are also interested
in learning more about how we made
these characters, we are going to do
a Tech Talk later for both Siren and Andy
here in this theater. Okay, so now let us talk about
augmented and mixed reality. In 2017, Chevrolet and The Mill showed the promise
of mixed reality with The Human Race
here on our stage at GDC. It was a sneak peek of what
is possible in UE4 for interactive marketing,
and how professional grade AR is going to change the way
that content is made. Peter Jackson's Wingnut AR
has also been doing some amazing ground-breaking
work in augmented reality. Last year,
Wingnut partnered with Apple to reveal ARKit at the Worldwide
Developer Conference. Wingnut were able to show off
a fully-explorable cinematic sequence in AR
that allowed you to punch in and see all of the details
of this scene. It also exhibited destruction
of a level that you normally
would only see in a movie, and it is all
running on an iPad Pro. So it was a pretty great demo. We were very happy with how
it turned out, it was amazing. As Magic Leap heralds a new era
of augmented reality, we are proud to announce support
for the Magic Leap One, and are super excited to see
what our early access customers are able to do
with this amazing device. While Magic Leap is bringing AR
to people’s homes, our friends from Riot have been
doing some incredible work with large-scale mixed reality. Please welcome to the stage
Kevin Scharff and Jerry O’Flaherty to talk about what Riot Games
and Zero Density were able to achieve at
the League of Legends World Championship.
Awesome, come on, guys. [APPLAUSE]>>Kevin: Hi there. Thank
you, Kim, and thank you, Epic, for giving us the opportunity
to talk this morning. I want to just discuss
one exciting way in which Riot Games
has leveraged Unreal to maintain
our dedication to and focus on our players.
We do a lot of things at Riot. In one area,
we dedicate resources towards the creative development
and elevation of our IP, often through project work adjacent to spaces of the core
League of Legends game. Riot values partnerships across
our various product groups so that we can bring
innovative and unique experiences
to our players, whether through CG stories,
music videos and albums, epic statues or even
tabletop games — we are always looking to offer
our players creative expressions
around the gaming lifestyle. And one of these expressions
is League of Legends as a premier eSport.
Even beyond live events, Riot eSports broadcast
to millions of viewers who watch from
all over the world, culminating in the World
Championship each fall. Our World Finals are like
the Super Bowl of our game, and Rioters teamed up this year
to bring a cutting edge in cinematic spectacle
to the world stage in China. The choice of the Bird's Nest,
Beijing National Stadium, set the stage
for an epic set piece. The challenge was clear. Why not take an iconic character
from our IP and literally drop it
on the center stage in an augmented
reality experience? The question is, how do we
support such a huge event? First, this required
assembling a team and pipeline suitable for global
distributed development. We found Unreal to be
a critical, universal tool and language
that allowed us to collaborate and solve complex problems
with our partners. The goal was achieved at grand scale,
with both fidelity and impact. With that, I will
pass it over to Jerry to walk us through the process.>>Jerry: Thank you, Kevin. Augmented reality has been
a part of live sports broadcasts since the 1990s; the first-and-ten line
appeared in 1998. Since then, it has
become an integral part of many of the major sport broadcasts we see
around the world. From the billboards in the
background, to stats and logos
the field, to player selects and even champ selects;
we use it in a very similar way in our League Championship
Series broadcast. So for the World’s
Opening Ceremony 2017, we wanted to do something
epic, something that was going to blow
our players' minds using augmented reality.
The idea, as Kevin said — take an iconic character,
the dragon, and bring him to life. Bring him to the Birds Nest
in China. To do that, we teamed with
the amazing folks at Passion Republic to help us translate the dragon
into cinematic fidelity, model him like
he is a real dragon, and animate him perfectly
to match the stadium, so that you can see
the fingernails curl over the edge of the roof. Be true to the style
and the aesthetic of the game, but imbue enough fidelity,
enough realism into it to allow it to match really well
with the live action plates. To do the integration
with the plates, we used the Reality Engine
from Zero Density. Built like a real-time
compositing tool, it allowed us to reach
into the image and manipulate almost
every aspect of it, right up until moments
before broadcast. Obviously,
testing something like this, something meant to be
in a stadium, can be a little problematic, so we used the Riot parking lot
to pull everything together right before
we shipped off to China. Dragons are awesome.
Why is this special? Live broadcast. Live broadcast
is unlike anything that I had ever done before.
You have live camera operators who have to hit their cues
perfectly every time. Two 40-foot jib arms with mechanical
tracking systems on them that have to stay in sync,
not lose calibration. You have live editorial
going on, cueing the cameras,
cueing your edits. You have a venue that was
changing underneath this. We had smog one day,
we had cloud cover the next day, a start time that shifted
a half hour to an hour. Lighting, shadows —
all of these things can change, completely breaking
the illusion. In the half dozen times
we rehearsed it right before live broadcast, I was shocked by just how
seat-of-your-pants live broadcast can truly be. It requires incredibly agile
tools, technology and people to accomplish
something at this scale and at this fidelity.
Thanks to the Unreal Engine, our amazing partners and the
amazing team that worked on it, we were able to bring
the dragon to life at the Bird's Nest in China. Here it is as it appeared
in the broadcast. [VIDEO PLAYS] Thank you!>>Kim: Fantastic!>>Jerry: Thank you.>>Kim: Wow! Mixed reality is going
to really change the way that people can go to massive
social events like that. It is awesome. Just think what it will be
like with AR goggles, amazing! All right. Okay, so now let us talk
about virtual reality. As VR technology matures, we are
super excited about the medium, and we really love
how it offers players and developers completely
new experiences. It is a really great place
to solve new sorts of problems in terms of interactivity
and emotion, and we really love it. This year, location-based VR
is coming into its own. The opening of Star Wars
Secrets of the Empire, a collaboration between
The Void and ILMxLAB has given us a taste
of the first VR blockbuster. Here to talk more about
how UE4 powers this and other experiences
is my really good friend, Mohen Leo from ILMxLAB.
Hey, Mohen, how is it going?>>Mohen: Thanks, Kim.
[APPLAUSE] At the end of last year,
ILMxLAB and The Void debuted our hyper-reality experience,
Star Wars: Secrets of the Empire,
to fan and critical acclaim. Guests are transported
to the Star Wars Universe where, disguised as Storm
Troopers in teams of four, they must recover
intelligence critical to the survival
of the budding rebellion. Our collaborators at The Void
call their multiplayer experiences hyper-reality, because you not only see
the Star Wars Universe, you smell the ash
in the air, touch the environment
around you, pick up a blaster,
and more. It is complete
and total immersion in the Star Wars Universe,
powered by Unreal Engine. Secrets of the Empire
is now open in Glendale, Anaheim, Orlando and London, and the Las Vegas location
is coming soon. Stay tuned for
more later this year. At ILMxLAB, we want to let
audiences step into our stories and experience
cinematic worlds that look and feel as real as the ones
on the movie screen. Working as a visual effects
supervisor at Industrial Light and Magic, I witnessed,
over the last few years, real-time rendering
edging closer and closer to the photorealism we expect
from blockbuster cinema effects. But for real-time rendering to
achieve true cinematic quality, for film production
and interactive experiences, there has always been
one big missing piece. To address this, Epic has been
working closely with NVIDIA, ILMxLAB and Microsoft on some next generation
rendering technology. What we are about to show
you is an experimental project that we hope will give you a peek
into the future of real-time
computer graphics. Let me welcome Epic Games’
own Jerome Platteaux and Marcus Wassmer
to the stage, who will demonstrate
for the first time, ever, interactive, real-time ray
tracing in Unreal Engine. [APPLAUSE]>>Mohen: Over the last
decade, ray tracing has become
the core rendering technique for offline renderers
in visual effects, in spite of the often daunting
rendering times of hours, or even tens of hours
per frame, but there is no other way
to achieve the realism we need. Being able to use
these same techniques in real-time rendering
will create huge opportunities, both for film
and television production, and for interactive
entertainment. Jerome, are we ready?>>Jerome: Yes, we are.
Let us do this.>>Mohen: In the real
world, light sources come in complex
shapes and configurations. Cinematographers
use light patterns to carefully shape lighting
and reflections. With textured area lights, we can create these
same effects in real-time. Just as important
as controlling the lighting is shaping the shadows. Soft shadows can dramatically
change the mood of a scene. On film sets, we use diffusers
and bounce cards of different sizes to control the softness
of light and shadows. Real-time ray tracing makes it
possible to use these effects, even with moving light sources.
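What makes ray tracing a natural fit for area lights is that soft shadows become a visibility question you can sample directly: cast several shadow rays toward random points on the light and average how many get through. A minimal sketch of that idea, with a stub standing in for the real scene intersection:

```cpp
// Minimal sketch of ray-traced soft shadows from a rectangular area light:
// sample points across the light, cast a shadow ray to each, and average
// the visibility. TraceShadowRay() is a stand-in for the real scene query.
#include <cstdlib>

struct Vec3 {
    float X, Y, Z;
    Vec3 operator+(const Vec3& B) const { return {X + B.X, Y + B.Y, Z + B.Z}; }
    Vec3 operator*(float S) const { return {X * S, Y * S, Z * S}; }
};

struct AreaLight {
    Vec3 Corner;   // one corner of the light rectangle
    Vec3 EdgeU;    // first edge vector
    Vec3 EdgeV;    // second edge vector (a larger light means softer shadows)
};

// Returns true if the segment from Origin to Target is unobstructed.
// Stub for the sketch; a real renderer would intersect the scene here,
// which is exactly the part hardware-accelerated ray tracing speeds up.
static bool TraceShadowRay(const Vec3& /*Origin*/, const Vec3& /*Target*/) { return true; }

static float Rand01() { return static_cast<float>(std::rand()) / RAND_MAX; }

// Fraction of the light visible from the shaded point, in [0, 1].
// 0 is fully shadowed (umbra), 1 is fully lit; values in between form the penumbra.
float AreaLightVisibility(const Vec3& ShadedPoint, const AreaLight& Light, int NumSamples = 16) {
    int Unoccluded = 0;
    for (int i = 0; i < NumSamples; ++i) {
        // Pick a random point on the light's surface.
        const Vec3 LightPoint = Light.Corner + Light.EdgeU * Rand01() + Light.EdgeV * Rand01();
        if (TraceShadowRay(ShadedPoint, LightPoint))
            ++Unoccluded;
    }
    return static_cast<float>(Unoccluded) / NumSamples;
}
```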
Now let us look at these characters in a more complex environment. You can see how ray traced
Ambient Occlusion really grounds the characters
in the world. But the best way to show off ray
tracing is obviously reflection, so let us bring someone
a lot shinier. [APPLAUSE] All of these lighting effects
and reflections also work
in dynamic environments. Improved cinematic depth
of field allows us to recreate the look and feel of real-world
cameras and lenses. Ray tracing is not just
about rendering characters. It is just as important
for creating realistic, complex environments. Whoa — let me reset that.>>Jerome, you want
to try the backup?>>Jerome: Uh-oh. I am here.
There we go.>>And we are back.>>Mohen: So rendering
all of these reflections on the fly allows control
over material properties, like roughness, and effects like
glossy reflections in real-time. The ability to ray-trace dynamic
character environments together with all of the complex
lighting effects, shadows, and inter-reflections
creates a level of realism previously impossible
in real-time rendering. To really show off
all of these techniques come together in production,
we decided to create a short, cinematic piece we call,
“Reflections.” What you are about to see
is all rendered in real-time in Unreal Engine. ♪ ♪ ♪ ♪
[Robot beeping]>>Stormtrooper 1: What is the
story with all the elevators lately?>>Stormtrooper 2: I heard
Kylo Ren destroyed the one over
in D-sector.>>Trooper 1: If you ask me,
whoever is in charge of this place should be transferred to Hoth. ♪ Imperial March elevator ♪
>>What? Oh. ♪ Imperial March continues ♪ ♪ Imperial March continues ♪
[Heavy footsteps] [Doors close]
[Siren wailing] ♪ Imperial March continues ♪
[Elevator engine whirring] [Stormtrooper straining
and groaning] [Doors open]>>Both: Hmm?
[Stormtrooper sucks in air] [Captain’s footsteps fade]
♪ Imperial March continues ♪>>Trooper 1: Mmm.
You think she heard us?>>Trooper 2: Yeah,
I think she heard us.>>Trooper 1: At least
we blend in, for once. [WAMPA HOWL]
♪ ♪ [Stormtrooper gasps]
♪ Drum Beats ♪ [APPLAUSE]>>Mohen: That is all, thank
you. [APPLAUSE]>>Kim: Thanks guys!
I guess 700 cell phones and a WiFi network
do not mix very well. Anyway, so thanks
to everybody at Lucasfilm for letting us do that crazy
piece. It was a ball. And honestly,
with this level of realism, you can see how the worlds
of filmmaking and game-making are converging;
not only in the creative ideas, but also in terms of the ability
to have content that is totally photorealistic. A decade from now,
I think just like Tim says, you will not be able
to tell the difference between the real world
and the virtual world. Everything you saw
would not be possible without the support of our
awesome friends at NVIDIA, so I am going to invite
to the stage Tony Tamasi, who is going to explain
all the hardware this ran on, and the software stack, and be generally awesome.
Hi, Tony.>>Tony Tamasi: Thank you.>>Kim: Welcome aboard. [APPLAUSE]>>Tony: So I do not know
about you guys, but I am about as geeked up
as I have ever been in maybe 20 years,
for graphics. Now I realize I am a bit
of a graphics nerd, but achieving that
level of realism in real-time has been a
dream of the industry for as long as I
have been in the industry, which is
several decades now. Really, it is because
this is kind of considered the Holy Grail
of rendering quality. You have heard it, they have been chasing this
for quite some time. The film industry
transitioned largely to ray tracing about a decade ago, but it really has
not been practical or even possible
to do this in real-time. Ray tracing solves a bunch
of the fundamental problems of traditional
rasterization-type graphics. It simulates the physical
behavior of light: its bounces, its transmission
through different media, and things like that.
But the problem has been, it has just been too demanding
historically to do that in real-time,
really, until now. So at NVIDIA, for the better
part of a decade, we have been working
on GPU-based ray tracing. We have developed technologies
for things like iRay and OptiX to accelerate
GPU-based ray tracing. And we are really now
at the cusp of real-time ray tracing becoming
a practical reality right now. What you saw would not
have been possible without all of that investment in R&D that we
have put into it, and we have distilled
all of that technology, know-how and architecture
into something we call RTX. This is ray tracing technology
specifically targeted to accelerate
real-time ray tracing. It is a combination of hardware
and software algorithms, and it runs
on Volta-based GPUs. The demo you just saw ran
on a DGX station, which uses four NVIDIA Volta
GPUs connected with NVLink, to achieve that level
of, essentially, real-time cinematic realism. RTX is that core fundamental
acceleration technology that enables
real-time ray tracing, but you have to be
able to program it. So we partnered with our
friends at Microsoft to deliver an
industry-standard API that is called DXR,
or DirectX Ray Tracing. That API is perfectly
integrated with RTX to enable real-time ray
tracing through APIs and interfaces that game
developers are familiar with. On top of that, we have layered
in some GameWorks technology to give developers
a kick-start for things like denoising
and reconstruction filters for shadows and reflections,
and ambient occlusion. Then, of course, we have
had a long-standing partnership with our
friends at Epic, and we have worked
closely with them to integrate all of this
technology into UE4 to unleash what we
think is the next generation of graphics,
but it is here now. And I expect you are
going to see games shipping with real-time
ray tracing this year. Thank you.
[APPLAUSE]>>Kim: Thanks, Tony!
Thanks a million. We will be taking
a much closer look at how we created that content and the underlying technology
later in a Tech Talk, here in this room. Now to talk more about
games momentum and what our amazing developers
have been doing with the engine. We would love to introduce
Dana Cowley to come on stage. Dana is going to blow you away
with some amazing video games. So speak to you later. I am off.>>Dana Cowley: All right.
Thank you, Kim. Hi! So there are no better experts in real-time technologies
than game developers. So let us talk about
how game developers are thriving in the heart
of consumer entertainment. And let us talk about
how Unreal developers are writing their own pages
in entertainment history books. This is a sampling of the
incredibly talented teams who are making the Unreal Games
community what it is today. At Epic, we are grateful to be
part of your accomplishments, and we are giving you
even better tools to help you keep
raising the bar, creatively and technically. We will continue to help you
be more successful financially. Revenue for Unreal developers is
setting new records every day. In 2017, Unreal Engine games earned more than $1 billion
on Steam alone. They are making high-quality,
critically acclaimed games that millions and millions
of people are playing and watching
and talking about. Take PlayerUnknown's
Battlegrounds, for example. PUBG launched
less than a year ago, and it is already the third
highest-grossing game on Steam in the platform's history. Rocket League —
another Steam top seller, now has more than 43 million
players across consoles and PC. On average,
6.5 million people are playing Rocket League
each month; and people have played
2.3 billion matches of Rocket League —
that is a lot of goals! So congrats to our friends
at Psyonix. Rocket League is playable here
in our booth at South 801 on Nintendo Switch. A year ago, System Era was here
on stage talking about Astroneer, which launched
at number one on Steam. In its first year
available in pre-alpha, Astroneer sold
over a million units, and it has been a top
100 seller on Steam for two years in a row. This was accomplished with
only 16 members at launch, and 15 full-time employees
a year later. This is what a small team
can do with Unreal. Astroneer just went
into alpha in December; you can play it now in Xbox Game
Preview and Steam Early Access. Let us talk about a few
incredible games that are coming soon
to everyone. We are going to take a look
at a very special game from a very talented team
of 15 people — give or take — PixelOpus is making their first
Unreal Engine game. At Epic, we fell in love
with Concrete Genie the first time
we laid eyes on it. Please give Dominic Robilliard
from PixelOpus a warm welcome. [APPLAUSE]>>Dominic: Thank you, Dana. So I am the creative director
of a small first-party team at PlayStation called PixelOpus,
as Dana mentioned. And I am here to tell you a
little bit about our current game, Concrete Genie, and how
our small 16-person team has leveraged Unreal Engine
to make it. Our goal at PixelOpus
is to make imaginative and beautiful games with heart, and after our first game,
Entwined, we decided that significantly
increasing the ambition and the scope of our next game could deliver that core idea
to a bigger audience. So after some preliminary
conversations about what we wanted
our next game to be about, we switched to Unreal. As Concrete Genie took shape
through concept and pre-production,
the things that we thought were special about it
started to come into focus, and led to some specific areas
of investment on the development front. I am going to walk through
some of those areas today. Cue the gameplay
footage, please. Concrete Genie is an action
adventure game about a boy who can bring
his paintings to life. The idea has presented
some interesting challenges in both the gameplay mechanics
and the rendering. We used the DualShock 4 controller's
motion sensors to intuitively control
Ash’s magic paintbrush, and then the strokes
that you make with the brush literally come to
life as you paint them. We have added a new
rendering pass to house all of
the visual impact that this has to have
on the game, and that is what
is getting us all of the controlled lighting, the bloom and the movement that we
need for the painting gameplay. One of our main goals
with Concrete Genie is to make sure that anyone
who plays the game feels like they are an artist, no matter their real-world
artistic talent. So making sure that every
single mark you make in the game is as beautiful as possible
is really important to us. As well as the
beautiful landscapes you can make in Concrete Genie, you also get to make your own
cast of unique creatures to play with in this world. These creatures
also come to life, each with their own AI
and abilities, depending on how
you painted them. We have used and extended the Unreal behavior system
to prototype and develop all of the AI
for the creatures in the game, and they have got a
ton of helpful and charming things they
can do for you, as well as helping you solve
lots of different puzzles to help explore this world
that we have created. Having as much variety in these
creatures that you make, and also in how they can behave
and respond to you quickly became a huge pressure on
our limited animation resources. So we have made our own
toolset pipeline to animate all of the 2D creatures
as efficiently as possible. This is another thing
that has worked out really well for us
using Unreal Engine, constructing our own
components to bolt in and change
our workflow as needed. And this particular tool
allows our animators to create simple key frame
poses in Photoshop, and then we vectorize them
and interpolate between them to get
really smooth animations, but in a fraction of the time that you would normally
expect it to take.
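That in-betweening step boils down to blending matching control points between the surrounding key poses. A toy version, with invented types rather than the team's actual tool:

```cpp
// Toy sketch of the in-betweening step of a 2D keyframe pipeline: each key
// pose is a set of vectorized control points, and intermediate frames are
// produced by blending matching points between the surrounding keys.
#include <cassert>
#include <cstddef>
#include <vector>

struct Point2D { float X, Y; };

struct VectorizedPose {
    float Time;                         // when this key pose occurs, in seconds
    std::vector<Point2D> ControlPoints; // same count and order for every key
};

// Blend two key poses; Alpha = 0 gives A, Alpha = 1 gives B.
VectorizedPose BlendPoses(const VectorizedPose& A, const VectorizedPose& B, float Alpha) {
    assert(A.ControlPoints.size() == B.ControlPoints.size());
    VectorizedPose Out;
    Out.Time = A.Time + (B.Time - A.Time) * Alpha;
    Out.ControlPoints.reserve(A.ControlPoints.size());
    for (size_t i = 0; i < A.ControlPoints.size(); ++i) {
        const Point2D& P = A.ControlPoints[i];
        const Point2D& Q = B.ControlPoints[i];
        Out.ControlPoints.push_back({P.X + (Q.X - P.X) * Alpha, P.Y + (Q.Y - P.Y) * Alpha});
    }
    return Out;
}

// Sample the animation at an arbitrary time by finding the surrounding keys.
VectorizedPose SamplePose(const std::vector<VectorizedPose>& Keys, float Time) {
    assert(!Keys.empty());
    if (Time <= Keys.front().Time) return Keys.front();
    if (Time >= Keys.back().Time)  return Keys.back();
    for (size_t i = 1; i < Keys.size(); ++i) {
        if (Time <= Keys[i].Time) {
            const float Alpha = (Time - Keys[i - 1].Time) / (Keys[i].Time - Keys[i - 1].Time);
            return BlendPoses(Keys[i - 1], Keys[i], Alpha);
        }
    }
    return Keys.back();
}
```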
One of our main goals at PixelOpus is to make games that have heart,
as I mentioned earlier on. For us, that means exploring
themes, characters and stories that resonate emotionally
for our players. We realized early on that
to deliver on that, we would need to find a way
for our characters to perform and emote convincingly
with our limited resources. The animators on our team are expert 2D animators,
as well as 3D, so we explored a new take
on animating expression textures across sculpted character heads, which topologically
are almost completely static. These 2D facial animations are
animated in Photoshop as well, and then compressed through
a command line tool to be mapped across all of
the characters' faces in game. It has led to some really
powerful moments in our story, and it fits really well
with our stylized art direction. Another important aspect
of our look and art direction is to make the world
feel hand-crafted and tactile, so we have used Substance
Painter and Unreal's World
Aligned Texture function to not only create
large-scale adjustments to the world way more quickly
than you would expect. That has helped us keep gameplay
at the top of our priority list when it comes
to the level design, and also iterating
through our polish phase. There is so much more to show
and talk about in Concrete Genie,
and we really look forward to sharing more
as we get close to launch. For now, then, I would like
to thank Kim Libreri and Joe Kreiner at Epic
for inviting us here to speak and share a little more detail about how Unreal is helping
a small team make its dream game.
Thank you. [APPLAUSE]>>Dana: Wow. Thanks.>>Dominic: Thank you.>>Dana: Thanks. Thanks. Thanks.
Thank you, Dom. Concrete Genie looks
absolutely captivating! So Undead Labs rose to
independent stardom upon the 2013 release of State of Decay. It was the fastest-selling,
original XBLA game of all time. Now they are about to ship
the sequel, which is built with Unreal Engine 4.
State of Decay 2 is one of the biggest games
coming to Xbox this year, so let us give a big welcome
to the studio head of Undead Labs, Jeff Strain. [APPLAUSE]>>Jeff: Thank you. So when we released
State of Decay in 2013, we were absolutely blown away by the response
from the gaming community. We had millions of fans tell us
that it was the first real swing at simulating
the survival apocalypse genre. We were humbled by the response, but we also knew that the sequel
would have to do more than just fall back
on the gameplay elements that made the first
one successful. And in particular we knew that it also had to have
more polish and better visuals. So that had to start
with a new Engine. But Undead Labs
is a game company; we are not a technology
or an engine company. Unless you are a mega studio, it is just really hard
to be both. We would rather spend out
technology budget on investing in those things
that make our games unique, such as the simulation engine that delivers
the State of Decay 2 survival fantasy experience,
and technology that allows us to do things
like in the big, open world, actually go into
any building you see and explore the interior. So we knew we needed
to find a good engine, and a good engine partner. So we wanted not only superior
rendering technology, but also a robust toolset that would allow us to
rapidly prototype game systems. We needed to augment the game
with our own technology, because there is really no
out-of-the-box engine out there that is going to give us
all the tools and systems and features we need
for State of Decay 2. So extensibility was very
important to us. Most importantly, we knew
that choosing an engine was not just
about the technology, but also the partnership and
support that came along with it. Since I am standing up here,
you have a pretty good idea of how this decision
worked out for us. We wound up choosing
Unreal Engine 4 and Epic as our partner. That allowed us to focus
on making great games, and trust Epic to focus
on making great game tools. It turns out they are actually
pretty good at that. So UE4 Blueprints,
if you are familiar with those, gave us the rapid prototyping
tools that we needed, and the plugin architecture
under UE4 gave us the extensibility
that we needed. But more than that,
UE4 enjoys tremendous and robust
industry-wide support, so that allowed us to bring
in some other technologies, such as Enlighten, which gives us the real-time
dynamic lighting system that powers the fluid
day-night cycle in the game, and also trueSKY,
which is the technology that drives
the beautiful dynamic sky and cloud systems in the game. Choosing an engine is a huge
decision for any game studio, regardless of size, because we
make tremendous technology and infrastructure investments
around that decision. And we rely on our engine
partner to support the engine and maintain its technological
integrity over time. So we certainly made the right
choice for State of Decay 2, and we are very much
looking forward to seeing what we can do next
with this engine technology. State of Decay 2 releases
on May 22nd, so please go out and
buy it, and ask all your friends
to go buy it, too. Thank you very much. [APPLAUSE]>>Dana: Thank you so much
for the kind words, Jeff. State of Decay 2 looks awesome, it is going to be such a blast
to play with your friends. So I’m looking forward
to checking it out. ARK: Survival Evolved does not
need much of an introduction. Studio Wildcard has done
a phenomenal job of continually
improving the game, releasing loads of new content, and bringing it to Windows,
Mac, Linux and consoles; and it stayed at the top
of the sales charts. And now, here to give you
an update, everyone, please welcome Jeremy Stieglitz
and Jesse Rapczak. [APPLAUSE]>>Jeremy: Thanks, Dana.
Hi, I am Jeremy Stieglitz.>>Jesse: And I’m Jesse
Rapczak.>>Jeremy: So
since our full launch of ARK: Survival
Evolved last August, the Wildcard team has been hard
at work making new content and improvements to the game
for the ARK community.>>Jesse: In December,
we launched our second huge
expansion for ARK, ARK Aberration.
And this year, we plan to launch our
next expansion, ARK Extinction. But already, we have launched
two huge Dino DLC updates for our original pack
of dinosaurs that we launched closer
to early access, with great visual
and functional updates. And one of those
just launched yesterday.>>Jeremy: These updates include
lots of bug fixes as well, quality of life improvements,
and features added to the game based on community suggestions
and feedback. But also, ARK now
has over four thousand mods on the Steam Workshop, and millions of subscribers
to those mods, thanks to the amazing tools
and functionality found in the Unreal kit.>>Jesse: We are also really excited
that our partner teams have been helping to expand
the ARK universe in new ways. You may have seen
that Snail Games has created two new ARK experiences,
for consoles and PC. ARK Park VR, which launches this
week in PlayStationVR and Steam, and PixARK, which launches
this month on Xbox One Game Preview
and Steam Early Access.>>Jeremy: War Drum
Studios, a partner, has designed the mobile version
of ARK for iOS and Android. It was just recently
announced, it is free to play, and it has
the same experience as the console versions
of the game. ARK fans who have signed up
for the iOS Beta should check their inboxes, as War Drum is starting
to send out invites to the iOS version today,
I believe.>>Jesse: So the flexibility
of UE4 has enabled Wildcard and our partner teams to take
the massive ARK experience to so many new platforms
and places.>>Jeremy: So stop by Epic's booth
to check out everything we have just talked about with ARK,
and all the new stuff. But speaking of new stuff, there is one more thing
we would like to show you.>>Jesse: This is something
you are going to be hearing about and seeing, exclusively here
for the first time today. This is ARK Survival Evolved
on Nintendo Switch!>>Both: On Nintendo Switch!
[APPLAUSE] So let us play it a little bit. So as you can see here, I am
a new survivor on the island. And these are my first moments. Normally in a game of ARK,
I would probably punch a tree, punch a Dino maybe,
harvest a few plants, build myself
a first initial base to try to survive the night. It is really amazing
how on the Switch we have actually managed
to incorporate all the same visuals
as every other console version. Just having it portable
in your hand is quite a unique experience, being able to take it wherever
you want to go.>>Jesse: Our team loves
this, because it means we can create our content
for fewer target looks, and know that that look
is going to translate to a full console experience
across all of these devices, including things
like multiplayer, and playing with your friends,
and joining tribes — all of that experience
that you can expect from ARK is available on the go
with the Switch.>>Jeremy: Let me go ahead
and hop on my trusty steed here, see if I can find
a small Dino to terrorize.>>Jesse: You’re so mean!
>>Jeremy: Well, you know, that is how you play ARK, right?
Have you played ARK?>>Jesse: Welcome to ARK.
>>Jeremy: Exactly. So we are really excited
about this version. It is just amazing
how Unreal has enabled us to achieve the parity
between platforms. And it is coming out
in fall, 2018. We are looking forward to
getting it into everybody's hands. Thank you so much!>>Jesse: Thanks a lot,
everyone! [APPLAUSE]>>Dana: Wow. ARK on
Switch — that’s huge! Content creators are
a driving force in consumer entertainment. Fortnite has over 130 million
daily video views. Now we are making it even easier to share the greatest gameplay
moments of Unreal-powered games. And we want to help you get
your content straight to Twitch, YouTube and social media. Here to tell you more about this
is Zeke Mateus, video editor, and Michael Gay, director of cinematic production
here at Epic. Everyone, please make some noise
for Zeke and Michael! [APPLAUSE]>>Zeke: Thank you, Dana.
We are still amazed by all the content
the Fortnite community creates. As a video editor at Epic, I know it takes time
to put together a video, and yet every day there is tons
of new and wonderful content being shared,
from college dorm rooms celebrating Victory Royales
to duos with Ninja and Drake. I definitely
spend way too much time watching clips at work. I get paid to do that,
it is pretty weird. We love the content so much,
we wanted to share some of our favorites
with you now. [VIDEO PLAYS]>>Have you heard of Fortnite? If you haven’t, chances
are your kids definitely have.>>Got it, baby, I got it. Let us go, baby! Yes!>>Shoot a rocket at me.>>Dude, it literally
just jumped up, like, 40!>>Oh! Let's go!>>Oh my God,
he may be falling for this trap, guys.
Oh, let’s go!>>Shotgun baby, send it. [PLAYERS CHEERING] [APPLAUSE]>>Zeke: We are so thankful
for the Fortnite community and all the incredible content
that they create. But would it not be cool to fuel all that creativity
with something new?>>Michael: So the team has
been hard at work creating new tools that are going to make it
possible for content creators to create even more stuff. So we are excited to show you,
for the first time ever, the replay system that we built
for Fortnite Battle Royale.>>Zeke: In fact, we invited
one of the biggest Fortnite content creators to come and check out
the new tool early. He has over
12 million subscribers. And when you search for
Fortnite on YouTube, his videos come up
before even our own. Please welcome Ali-A.
[APPLAUSE]>>Ali-A: Thank you, Zeke,
and thank you, Epic Games, for having me here today.
A few weeks ago I headed over to the Epic Games
offices in the U.K., and actually got to get hands-on and use this brand
new replay system for the very first time.
I have to say it is amazing. For the millions of content
creators around the world, just like myself, it is going
to allow us to improve and make even bigger
and better content. Even for the everyday
Fortnite players, it means as soon
as a game is done, they can jump back in and review
that gameplay through the replay system
to see what went wrong, or what worked out on the road
to that Victory Royale. It is amazing. In fact, Zeke, I think we can
show the guys right here the new replay system in action
for the first time, right?>>Zeke: I think we can.
Michael, do you want to help us out?>>Michael: Let us do it. So what we are looking at here
is actually a replay that we recorded
when Zeke and Ali were playing. So let us take
a couple of minutes and see if we can create
a cool shot in the system.>>Ali-A: Let us do it. So this is
the brand new replay system. We can see the UI
along the bottom. There is tons of customization; we can speed up
and slow down the action, and view the world of Fortnite in a way that we have
never seen it before. Imagine you have got an amazing,
expensive camera in the world of Fortnite —
you can customize everything. So Michael here can change
everything from aperture to focal length to manual focus — all of that is available
for you guys to customize in this replay feature.
It is absolutely amazing. This is actually me
in game right now. He is wearing my favorite skin,
the Wukong skin; it looks pretty damn awesome. And actually, as Michael said,
this is a gameplay shot both Zeke and I had
in Fatal Fields. It is pretty
action-packed, right?>>Zeke: I think so. You want
to check it out a little bit?>>Ali-A: Let us do it.
>>Michael: Sure. This is the rooftop battle
you guys had with those guys that were set up across the way.>>Ali-A: Yes. This is awesome.
And Michael is actually going to turn on
even more features here. We are going to be able
to see the weapons that everyone
is holding in game, their health, their shield —
all at a quick glance. You can get all of that
information, and even highlight enemy players
and teammates as well. So as this action
is going down, we are going to pan
the camera around, and see Fortnite in a way
we have never seen it before. This is the enemy team. They set up pretty nicely in the
house here in Fatal Fields. We are trying to take them down
from the barn across the way. As we are doing
so, Zeke actually lands a nasty
crossbow shot, taking out one of the players
— that was pretty good.>>Zeke: Thanks, man.
I have my shots, sometimes.>>Ali-A: It was
actually perfect and set up really well, as
Michael will show here, for me to rush
across Fatal Fields and start a cheeky
flank maneuver and try to go in behind
the enemy team and catch them off-guard — again, this is using the drone
follow, a new camera tool within the replay system
to view this — something we would not
have been able to do before. So the teams
have just rezzed up. But Zeke,
your job is not over yet, because you need
to distract them.>>Zeke: That is right.
So I decided to try and
give you some cover. Things do not quite
go according to plan. I start
getting shot at, and I think I actually
almost get blown up twice.>>Ali-A: Yeah, it is close.
It is very hairy. You take some damage.
But do not forget, I am flanking around
the back here, and you have done your job
to distract them. Before they know it,
I am actually in their base, pull out my tac shotgun,
take out one, take out two — and we win that gun fight.
As a result, I have a load of loot
to share between both of us. It was a great action piece.>>Zeke: That was pretty fun. You know, I am more of a hand
cannon kind of guy myself, but the crossbow
did put in work. And Ali,
I think if I remember correctly, we went on to win this game.>>Ali-A: We
probably did, Zeke. That is a story
for another time. Having had a quick look
at the replay system here, the guys
at Epic Games and I have actually worked together on an amazing cinematic
piece, to show you an example of what
you can create with this tool. I am sure you guys
are really excited to see it for the first time.
So without further ado, let us take a look at that video
right here, right now. [VIDEO PLAYS] ♪ ♪ ♪ Yeah. Once again, here we go.
Know the name, know the show. ♪ Turn me up a little more.
Setting traps on the law. ♪ Give me that city,
yeah, nitty gritty ♪ This ain’t gonna
be pretty, yea. ♪ Not afraid,
get up out of the way ♪ You all used to, hey,
look at what you make. ♪ When it all goes down,
I’m gonna run this town! ♪ I am my soul, got my eyes on the
goal. ♪ I can’t help myself ♪ baby, can’t turn
myself in. ♪ I’m in love with you baby,
but I let you down. ♪ I can’t die in this town.
♪ I won’t die in this town, hey! ♪ I won’t die in this town! [APPLAUSE]>>Zeke: Every
time I watch that, Ali, that
video — so badass.>>Ali-A: It was a lot of fun
to put together, and I am sure everyone here
and at home cannot wait to use
the new replay system.>>Zeke: You won’t
have to wait too long. The replay system will be
debuting in Fortnite soon, and will be available on PC
and console. And do not forget to stop
by the booth, because I think we have got
some nice shirts. You can claim one.>>Ali-A: Definitely.
>>Michael: Yep. We are also excited
to bring these improvements to the replay system
right into the engine, so developers everywhere
can put this into their games. You can get all of the stuff
in version 4.20. Also, if you stop by the booth, you can check out
the replay system in action and fly your own cameras around. I would like to thank Ali
for joining us on stage, and all the content
creators out there that are making games as fun to
watch as they are to play. Thanks, everybody!>>Ali-A: Thank you.
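(A quick aside for developers, not part of the stage talk: the Battle Royale replay tool sits on top of the engine-level replay system that Michael says ships in 4.20, and that layer is driven through the stock UGameInstance calls. Below is a minimal sketch, assuming the project has a replay streamer / DemoNetDriver configured; the stream name "MyMatch" is only a placeholder.)

    #include "Engine/GameInstance.h"
    #include "Kismet/GameplayStatics.h"

    // Begin recording the current session to a named replay stream.
    void StartMatchRecording(UObject* WorldContextObject)
    {
        if (UGameInstance* GameInstance = UGameplayStatics::GetGameInstance(WorldContextObject))
        {
            GameInstance->StartRecordingReplay(TEXT("MyMatch"), TEXT("My Match"));
        }
    }

    // Stop writing the replay stream.
    void StopMatchRecording(UObject* WorldContextObject)
    {
        if (UGameInstance* GameInstance = UGameplayStatics::GetGameInstance(WorldContextObject))
        {
            GameInstance->StopRecordingReplay();
        }
    }

    // Load the recorded stream back for playback (scrubbing, free cameras, and so on).
    void WatchMatchReplay(UObject* WorldContextObject)
    {
        if (UGameInstance* GameInstance = UGameplayStatics::GetGameInstance(WorldContextObject))
        {
            GameInstance->PlayReplay(TEXT("MyMatch"));
        }
    }

(The camera UI shown on stage is Fortnite's own; the recording and playback layer underneath is the engine piece referred to above.)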
>>Zeke: See you at the booth! [APPLAUSE]>>Tim: That’s really
awesome. What you are seeing
happen here with Fortnite and Rocket League
and PUBG really highlights that games are now
becoming social experiences. They are much more fun to play when you are playing them
with your real-world friends. So Epic has been working
on a really extensive diplomatic effort on Fortnite
to lead the way forward to interconnecting all users
on behalf of all developers. And we are really grateful
that both Sony and Microsoft have agreed to allow
cross-platform play between each of those platforms
and PC and Mac, and iOS and Android.
Right, so we have six platforms. There are 36 possible
connections. There is only one connection
that is not enabled yet, and that is the Sony
to Xbox connection. We are working with that
with all of these folks, and we are optimistic
that they will come. Underlying all of this,
there is a great value to interoperability
for everybody; for gamers,
for game developers, and for the platform
companies themselves. There is an economic
principle here that was first stated
by Mr. Metcalfe, the inventor of Ethernet. He said that the value
of a network for each user is proportional
with the number of nodes that each user
can connect to. Essentially,
the more connectivity, the more value
there is for everybody. So the value of a network
as a whole grows according to the square
of the number of nodes. That is all good and abstract, but what we are really
talking about here is not nodes and network,
it is people, right? These are our customers. They are our kids in school
who have friends in real life, and want to play games along
with their real-world friends. Now almost all of the platforms
are connected together, and we are making
really great progress. Based on all of the
economics we have learned, many of the most
valuable companies in the world are companies that help
connect users socially now, and deservedly so. So we think that
there is great value to Sony in connecting with Xbox players
and great value to Microsoft in connecting
with Sony players, and that everybody
in the whole industry would be better off with
these connections in place. So we will continue to advocate
on that behalf. [APPLAUSE] And you should, too.
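(An editor's sketch of the Metcalfe arithmetic above; the notation is ours and was not part of the talk.)

    % n = number of nodes (users, or platforms) in the network
    V_{\text{per user}}(n) \;\propto\; n - 1,
    \qquad
    V_{\text{network}}(n) \;\propto\; n\,(n - 1) \;\approx\; n^{2}.
    % With the six platforms listed above, that gives \binom{6}{2} = 15
    % distinct platform-to-platform links, or 36 cells if you count the
    % full 6 x 6 connection matrix, pairing each platform with every
    % platform including itself.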
You should, too. It is feeling
pretty lonely up here. But you know, at Epic,
we really believe this is the future of gaming, that we are building games
for gamers ourselves, and we are releasing the tools
for all developers to be able to do
this themselves. We are releasing games
across all of the platforms and transforming mobile
gaming from a casual experience of the Angry Birds class
to a first-class PC and console-quality experience
for serious gamers. You know, I think
by helping developers make their games into social experiences,
we aim to enable them to link gamers
together with their friends, and streamers
and content creators, like Ali-A at YouTube, and build a much larger
community around their games. This is a great and valuable
growth for the whole industry. This is not speculation
about the future, right? This is
happening right now, and it is happening
across many games, many of them powered by Unreal. We think it is a wonderful
time for the industry. So to conclude,
we prepared a little video to show off what is happening
with social gaming in Unreal today in 2018. But again, I would like
to invite you to come over to our booth
and check out our games, check out job opportunities,
and drink our beer. So thank you very much
for coming. And let’s take a look. [ELECTRIC HUM] ♪ ♪ ♪ If we work to recover,
we have champions crawling. ♪ I will celebrate. ♪ Life’s too short,
no time to waste. ♪ For the victory, this ain’t
no stoppin’ me. ♪ go hard go fast, lets go! ♪ All my winners stand up,
All my winners stand up, ♪ If you feel like we’re a
legacy, put your hands up. ♪ All my winners stand up,
All my winners stand up, ♪ If you feel like we’re a
legacy, put your hands up. ♪ ♪ ♪ All my winners stand up, ♪ ♪ ♪ All my winners stand up,
All my winners stand up, ♪ If you feel like we’re a legacy,
you’ll put your hands up.

100 thoughts on “State of Unreal | GDC 2018 | Unreal Engine”

  1. What's with the mobile games rubbish? Playing a game on mobile is like driving a car with no hands, unless you're playing Candy Crush…games like PUBG on mobile lol, good luck.

  2. Mr. Serkis' performance is awesome! YEAP – one day we will hear "And the Oscar goes to" it's digital makeup from here on in.
    What else would you add?
    Huge amounts of spit flying out when delivering that performance. Ever been in the front row of a performance? You'll get a full 3D effect to take home.
    Add vibrations of the powerful plosive air forcing lips and face to move. Have the actor blow a HUGE raspberry for reference. watch the slow mo guys get slapped.
    Talking of spit in the air – the air was too clean and there was no lint or fluff wafting about. Just a bit of dandruff anywhere.
    BUT all this in REAL TIME.. we are living in the future. Those 4 Voltas will be desktop by 2022. 🙂

  3. Eventually polygon based engines will evolve into molecule based animation engines. Molecule based animations are much easier to process so we can only imagine how much more we will get using less powerful computers.

  4. Pretty soon they'll be doctoring the news and politics on a whole new level. Get ready to not trust anything you see.

  5. remember in the end day we will not know what is real or fake.
    that way you can make all the fake news about people you don't like .
    you know the people that try to save other people from pedifiles, sjw, isis, and democrats.

  6. For me the answer is already here. Why recreate real life when we can experience it now? the inherent unsatisfactoriness of mankind is what leads us to chase after something more. Its an endless and essentially fruitless journey. Only the realisation that we have what we need in ourselves and those around us can we truly be free and live our lives to the fullest.

  7. I think you need to focus on optimizing performance, idtech 6 on vulkan still is the king of graphics engine thanks for his graphics / performance. I suggest optimizing your vulkan implementation on unreal engine 4 so developers can use it more easily.

  8. That Switch ARK demo is a slide show. And nobody's going to tell me they're going to improve it to a playable level.

  9. So it takes 4x Titan V100s for a 1080p24 ray tracing demo? Ray tracing isn't as big of a deal and won't be coming to games anytime soon then lmao. People are all up in arms about it, it's far off.

  10. The human replications were excellent but still have a little ways to go but when you apply the data to a non-human avatar it looks perfect.

  11. And still Unreal Tournament gets nothing…

    Lol i'm playing it with a 1080ti, 32gb ram, i7 6800k all low settings & i still can't average 240fps…
    Give UT some love epic.

  12. So they are using 90s graphic from movie industry xD
    Do you remember this graphic in the SF movies from the 90s? xD

  13. 22:31 controlling the camera with an iPad, wouldn't that be akin to James Cameron's tech used in avatar when he wanted that realistic camera movement by the natural shaking from the human hand?

  14. You guys MUST make a sci-fi MacBeth using Osiris. Looked fantastic.
    Congratulations on all of the many plaudits I’m sure you and your teams will receive as a result of this tech.

  15. Remember: The star wars "game footage" is run on quad-sli titan voltas. rip your wallets out if you want that kind of performance any time soon xD

  16. ark on switch looks awesome but something tells me it won't run well, will they really have the same quality standards as nintendo to have a good framerate?

  17. I never thought of that, they can create a virtual world in this engine for film so the actors have a real sense of the space. Goodbye imagination.
    All the marvel cinematic universe actors can actually be on an alien world through virtual reality while they’re acting.

    I’m rambling and I’m not sure how they’d film the actors with VR gear on…..the fact an actor could get a peek at the virtual world through VR before their performance would probably help them get a sense of the world they’ll be inhabiting.

  18. Beam me up, Scotty.
    There’s already clones of all of us living their lives in Unreal Engine 4. A hellish virtual prison where all our clones are mining crypto currency as we speak…..😱

    Edit

  19. Great demonstrations, showcase and an exciting future for gaming and all media in general. Sadly, the main presenters Tim & Kim lack dramatic expressions & the enthusiasm to really hype the audience. They seem to just want to get the presentation over with and get the info out there to not have to deal with more stage fright.

    As top-level executives, surely they have had many chances to hone their public speaking skills. If not, they should hire someone to help them reach their audience better. They kept stealing the thunder from the different demos & presenters, not allowing them to absorb their applause and cheers, every time they came back on stage.
    No applause, no smiles, no laughter with excitement, and robotically nervous body language.

    Zeke was a far superior speaker and presenter than the rest (excluding Ali A); he should school these guys!

    I for one kept getting pulled away from this presentation due to poor presenter transitions & the mains inability to express, taking away partly my viewing experience.
    Overall, one of the best conferences from any Gaming or related company of the past few years. Keep pushing beyond!

  20. Very cool how much more believable the performance was on the alien face than on the human. I guess we are very attuned to human physiology but are more willing to take liberties with something so alien.

  21. The engine's graphics performance is fantastic, but totally meaningless to me since my current NVidia GTX 285 can't play any of the new games. Haven't bought a new game for over two years. Getting a new graphics card is the obvious answer, but I usually like to pay a bit less than MSRP and certainly never MORE than MSRP for ANYTHING. So until graphics card prices come back down to earth, I'll be spending my hard earned money on other forms of entertainment.

  22. Glad to see a company still run by a real nerd winning. Don't give in to the suits! Viva Epic! Esse quam videri!

  23. can anyone tell me what that camera app was during the Facial Capture section? incredible stuff… that would be very bloody handy!

  24. Is Unreal coming up with something similar to Unity's burst compiler? -Unreal might have the best graphics, but Unity is going to be superior on performance

  25. I love this stuff, brings me a lot of joy to see the gaming being creative again and building new and amazing things for gamers alike.

  26. I truly thank Epic for releasing the Paragon assets. It's been a huge help to test out production assets in the engine rather than waiting for art assets to finish. Now I can focus the majority of my time on learning level design and improving blueprints for my racing game.

    The biggest difference between Unreal and Unity (which I started off with) is that Unreal feels like it was made by a game developer. Everything feels more organized and practical, and critical rendering features that have to be purchased as add-ons in Unity have apparently long been standard in Unreal. I'm not a programmer by trade, but Blueprints makes sense. And that's an epic achievement in its own right.

  27. So are there any real actual playable games that have great graphics? I've seen plenty of videos about how "amazing" unreal engine is but not a single game where the graphics impressed.

  28. .. gee, we used to play war with dirt clods .. more blood and guts .. there must be something good about this .. what is it?
