Unreal 5 & The Augmented Reality Roadmap

The Internet was left gasping for breath after a preview of Unreal 5.

As a peek at what will be possible with the next generation of consoles, it was stunning.

But it isn’t just consoles that matter. Epic has other ambitions. And Unreal 5 also provides some intriguing insights into where augmented reality (AR) might be headed.

If you haven’t seen it yet, give it a watch. Come back when you’re done and you’ve managed to get your breath back:

And if you think it’s just the fanbois who care, Robert Scoble asked Pixar co-founder Alvy Ray Smith what he thought:

Immersion and Game Play

Epic has demonstrated a level of realism and richness with Unreal 5 that is unparalleled in a video game. Under the hood, the advances will super-power development and allow creators to focus more on the creation part and less on optimization.

And honestly, it’s like my early dreams of computers come true.

My first game was an MS-DOS version of SimCity. I used to imagine what it would be like if I could walk around the city streets I created. I imagined a day when the worlds of computers would be as real as, well, reality.

It took me a years-long diversion into virtual worlds, and books like Raph Koster’s A Theory of Fun to realize that visual fidelity and enjoyment didn’t always go hand-in-hand. A lot of the avatars you’re walking around with these days don’t LOOK all that great, but they still give the user a sense of agency and presence.

Fortnite won’t win any realism awards, but it’s raking in the billions that are being used to supercharge Epic’s Metaverse ambitions.

Unreal and the Epic Juggernaut

Which brings us to Matthew Ball. Because any discussion of Unreal, or of Epic, should have his series as its companion reading:

Matthew proposes that Epic is on its way to being the next entertainment powerhouse, and that they are radically changing the dynamics of the game industry.

He views Tim Sweeney as a sort of Steve Jobs meets Walt Disney, a savant whose decades of vision are now being super-charged by the Fortnite money machine.

And within that vision, Unreal is just a component. And a component from which Epic won’t even make that much money:

“Even if every single game used Unreal, from Flappy Bird clones on up to every single game from Activision Blizzard through to Nintendo, Epic would generate only $6B in revenue.”

He compares their strategy to Unity, which charges a per-seat license fee, while Unreal now waives all fees until your game earns $1M:

“Now a small developer with tight budgets faces a unique risk–reward decision when picking an engine: Unity, with an upfront fee, or Unreal, which is free unless they’re enormously successful. Unity may choose to match Unreal’s waiver, but even if it doesn’t, Epic’s move will reduce the value of the engine industry overall.”

So what’s in it for Epic? If Unreal has a hard cap on how much money it can make, why would Epic bother? Why not churn out a bunch of Fortnite spin-offs instead?

Ball proposes that they’re focused on building something bigger: the Metaverse. And that in order to get there, they need creators.

Unreal isn’t about games. It’s about building the tool sets for a content creation explosion.

“Epic’s benefit is that we operate games. We have one of the biggest in the world, and we get more Fortnite friends as a result of it. We can also really help [lift] up the industry as a group of companies that cooperate together and collaborate together to reach users as opposed to fighting each other. The worst term that’s ever been invented in the history of the internet is ‘own the customer.’ The customer owns themselves! I’m sorry; read the Magna Carta. Our aim is just to help all game developers do that in the way that we’ve done it with Fortnite.” - Tim Sweeney

The Next Disney: Strategic Components

Unreal isn’t the only component in this strategy to build the ‘Metaverse’.

Epic also owns a game store which is going head-to-head with Steam and with the console stores. It owns the tools that let game developers add social features. It has the tools to help you make your game in the first place.

“Today, (Epic) operates one of the two most widely deployed and technically capable engines behind virtual worlds in gaming, film, and television,” says Ball. “In addition, Unreal is rapidly expanding into new virtual use cases, from concerts to theme parks, architecture to design. This not only gives Epic unique influence over the standards and pipelines of the future, but it also makes it much easier for future “worlds” and “experiences” to interconnect.” 

The only thing Epic is missing, really, is the end delivery mechanism. They don’t sell computers, consoles, phones, or tablets. They have a massive fan base. They have titles. They just don’t own the devices on which you’ll view all this stuff.

Unreal and Augmented Reality

Unreal supports augmented reality. Unreal 4.25 added support for HoloLens 2, Magic Leap, and Azure Spatial Anchors.

And so you can work in Unreal and deliver experiences to headsets, or you can target Apple and Android phones and tablets through ARKit and ARCore.

The problem is that the ‘install base’ for rich AR is small. It isn’t just that Epic doesn’t ‘own’ the device ecosystem on which games are delivered (just as they don’t own PlayStation), it’s that there’s no ecosystem to start with.

And so AR feels like an add-on. It’s like putting a high-powered engine into a golf cart.

And besides, most people use Unity. It does the trick. And Unity is already being used for VR and for mobile games, so there’s an installed base of developers who know the tools.

Unreal is for the big guys. It’s for triple-A titles and games with a hundred million players. Epic has bigger fish to fry than AR.

Unreal 5: Solving The Tough Challenges of Our AR Future

Once you get past how stunning the graphics will look on your next-generation PlayStation, the Unreal 5 demo reveals that it’s solving a few insanely difficult problems.

And how they SOLVE those problems has massive implications for augmented reality.

Now, to be clear: these problems have been solved in the current demo through deep coupling between hardware (i.e. consoles) and software (i.e. Unreal).

And so it might not seem like there’s a direct line to AR.

But one of the key success-drivers for Epic is this: they are developing a platform where you can publish once to multiple devices.

Think about Fortnite: you can play it on a console or a PC, on a phone or tablet. Behind the scenes, the engine ‘renders’ the world at different levels of fidelity.

And this is absolutely critical to understanding Epic’s vision. Their technologies are being built so that they are adaptive to end platforms. Those lighting effects might look insanely awesome on a PS5, but they’ll still be present on lower-fidelity devices (they just won’t look as hot).
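
To make that concrete, here is a toy, standalone C++ sketch of the idea: one piece of content paired with several per-device quality tiers. The tier names and numbers are my own illustration, not Unreal’s actual scalability settings.

```cpp
#include <cstdio>
#include <map>
#include <string>

struct QualityTier {
    int   shadowResolution;  // shadow map size in pixels
    float renderScale;       // fraction of native output resolution
    bool  dynamicGI;         // fully dynamic global illumination on/off
};

int main() {
    // One piece of content, several presentation targets. These tiers and
    // numbers are illustrative, not Unreal's actual scalability settings.
    std::map<std::string, QualityTier> tiers = {
        {"phone",   { 512, 0.75f, false}},
        {"console", {2048, 1.00f, true }},
        {"pc_high", {4096, 1.00f, true }},
    };
    for (const auto& [platform, q] : tiers) {
        std::printf("%-8s shadows=%d scale=%.2f dynamicGI=%s\n",
                    platform.c_str(), q.shadowResolution, q.renderScale,
                    q.dynamicGI ? "yes" : "no");
    }
}
```

The point isn’t the specific knobs; it’s that the creator authors the scene once and the engine picks the presentation.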

And so Unreal 5 might give some hints on how the engine could eventually roll out into AR:

Lighting

One of the challenges with “rich AR” (which I define as the placement of 3D content into real-world views) is lighting. Images appear flat and out-of-place if they don’t match the light of the world in which they’re placed.

And so much of AR either uses subtle lighting that is baked into the 3D objects, or no lighting effect at all.

There are various work-arounds to this. You can use algorithms to detect lighting sources through your iPhone’s camera, for example, and then display a version of your 3D objects which matches. Or, instead of this ‘matching’ process, you can render your objects in the cloud.
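
As a deliberately simplified, standalone C++ sketch of that ‘matching’ idea (not ARKit or Unreal code), imagine an estimated intensity and colour tint for the real room being folded into a virtual object’s base colour. Real pipelines do this per-pixel in shaders; the structure below just shows the shape of the problem.

```cpp
#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };

// Hypothetical light estimate derived from the device camera feed:
// an overall intensity (0..1) and an approximate ambient colour tint.
struct LightEstimate {
    float intensity;   // 0 = dark room, 1 = bright daylight
    Color tint;        // approximate ambient colour
};

// Shade the object's albedo by the estimated real-world light so the
// render matches the room instead of looking flat and pasted-on.
Color ApplyEstimate(const Color& albedo, const LightEstimate& est) {
    auto mul = [&](float a, float t) {
        return std::clamp(a * t * est.intensity, 0.0f, 1.0f);
    };
    return { mul(albedo.r, est.tint.r),
             mul(albedo.g, est.tint.g),
             mul(albedo.b, est.tint.b) };
}

int main() {
    Color albedo{0.8f, 0.2f, 0.2f};                       // a red virtual object
    LightEstimate warmEvening{0.4f, {1.0f, 0.85f, 0.7f}}; // dim, warm room light
    Color lit = ApplyEstimate(albedo, warmEvening);
    std::printf("lit colour: %.2f %.2f %.2f\n", lit.r, lit.g, lit.b);
}
```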

Unreal is launching Lumen to handle lighting for games:

Lumen is a fully dynamic global illumination solution that immediately reacts to scene and light changes. The system renders diffuse interreflection with infinite bounces and indirect specular reflections in huge, detailed environments, at scales ranging from kilometers to millimeters. Artists and designers can create more dynamic scenes using Lumen, for example, changing the sun angle for time of day, turning on a flashlight, or blowing a hole in the ceiling, and indirect lighting will adapt accordingly. Lumen erases the need to wait for lightmap bakes to finish and to author light map UVs—a huge time savings when an artist can move a light inside the Unreal Editor and lighting looks the same as when the game is run on console.

If they can port even some small degree of this capability to AR, where REAL light is the thing that a scene reacts to, then AR can start looking pretty stunning.

Polygon Counts, Oh My

When you’re developing 3D models for VR, AR or games, you’re always keeping track of polygons. And you may actually create multiple versions of an object for different resolutions.

When you’re approaching a tree in a game environment, you’ll see a “low resolution” version of the tree from a distance which will be replaced by higher and higher resolutions as you approach it.

This swapping is a product of polygon (and texture) budgets, forced by limits on how much your computer (or tablet) can process.
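
Here is a standalone C++ sketch of that hand-authored LOD workflow, with a hypothetical tree asset authored at three polygon budgets. Every name and number is illustrative; the bookkeeping itself is exactly the burden described above.

```cpp
#include <cstdio>
#include <vector>

struct MeshLOD {
    const char* name;
    int         polygonCount;
    float       maxDistance;   // use this version up to this distance (metres)
};

// Pick the first LOD whose distance threshold covers the viewer;
// fall back to the coarsest version for anything farther away.
const MeshLOD& SelectLOD(const std::vector<MeshLOD>& lods, float distance) {
    for (const MeshLOD& lod : lods) {
        if (distance <= lod.maxDistance) return lod;
    }
    return lods.back();
}

int main() {
    // Hypothetical tree asset authored at three polygon budgets.
    std::vector<MeshLOD> tree = {
        {"tree_LOD0", 120000, 10.0f},
        {"tree_LOD1",  20000, 40.0f},
        {"tree_LOD2",   1500, 1e9f},
    };
    for (float d : {5.0f, 25.0f, 200.0f}) {
        const MeshLOD& lod = SelectLOD(tree, d);
        std::printf("%.0fm away -> %s (%d polys)\n", d, lod.name, lod.polygonCount);
    }
}
```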

Unreal will launch something called Nanite to remove this burden entirely:

Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine—anything from ZBrush sculpts to photogrammetry scans to CAD data—and it just works. Nanite geometry is streamed and scaled in real time so there are no more polygon count budgets, polygon memory budgets, or draw count budgets; there is no need to bake details to normal maps or manually author LODs; and there is no loss in quality.

As this gets ported to AR, Nanite will mean the ability to create incredibly rich 3D objects and let the engine handle how to optimize them for different resolutions.

Animation Triggers

The Unreal team showed off another little feature that I could see being adapted for AR: programmatic animation. It may seem like a tiny thing, but watch how the character puts her hand on the door:

And now imagine that in AR we detect that there’s a chair in the room. And that we can then allow virtual characters to interact with that chair in a way that feels natural.

We can’t possibly capture animations for every type of chair, but Unreal hints at how animation can be smoothly handled programmatically.
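
To make the idea concrete, here is a toy, standalone C++ sketch (emphatically not Epic’s animation system): a surface point detected in the room becomes the hand’s target if it’s within reach; otherwise the character keeps its default animated pose. Every name and value is hypothetical.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

// If the detected point (say, the edge of a real chair) is within the
// character's reach, use it as the hand target; otherwise keep the hand
// on its default animated path.
Vec3 ChooseHandTarget(const Vec3& shoulder, const Vec3& detectedPoint,
                      const Vec3& defaultHandPos, float reach) {
    return Distance(shoulder, detectedPoint) <= reach ? detectedPoint
                                                      : defaultHandPos;
}

int main() {
    Vec3 shoulder{0.0f, 1.4f, 0.0f};
    Vec3 chairEdge{0.3f, 0.9f, 0.4f};   // hypothetical point from plane detection
    Vec3 defaultHand{0.2f, 1.0f, 0.1f};
    Vec3 target = ChooseHandTarget(shoulder, chairEdge, defaultHand, 0.8f);
    std::printf("hand target: %.2f %.2f %.2f\n", target.x, target.y, target.z);
}
```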

A Multi-Platform Future

These features are being tightly coupled to the next generation of game consoles.

And yet they run parallel to a philosophy in which developing content should allow for adaptive presentation: when they create a new Fortnite chapter they don’t create multiple versions of the same thing.

Instead, the engine is doing more and more of the hard work of adapting the design work to the different end platforms (from phone to PC to console).

With the addition of real-time lighting management and the ability to import extremely high polygon models and let the engine sort out how to present them, Unreal is allowing developers to focus more on creation and less on optimization.

And against the backdrop of Epic’s big dreams, it’s not a stretch to say that we’re seeing a hint at where their ambitions might take them in AR.

We may still be years away from rich 3D capabilities in AR glasses. But Unreal might just be laying the groundwork for a Metaverse that you won’t just visit when you “log in”, but which will be ever-present in the world around you.
