Unreal Engine 4 Virtual Reality Projects

By: Kevin Mack, Robert Ruud
Overview of this book

Unreal Engine 4 (UE4) is a powerful tool for developing VR games and applications. With its visual scripting language, Blueprint, and built-in support for all major VR headsets, it's a perfect tool for designers, artists, and engineers to realize their visions in VR. This book will guide you step-by-step through a series of projects that teach essential concepts and techniques for VR development in UE4. You will begin by learning how to think about (and design for) VR and then proceed to set up a development environment. A series of practical projects follows, taking you through essential VR concepts. Through these exercises, you'll learn how to set up UE4 projects that run effectively in VR, how to build player locomotion schemes, and how to use hand controllers to interact with the world. You'll then move on to create user interfaces in 3D space, use the editor's VR mode to build environments directly in VR, and profile/optimize worlds you've built. Finally, you'll explore more advanced topics, such as displaying stereo media in VR, networking in Unreal, and using plugins to extend the engine. Throughout, this book focuses on creating a deeper understanding of why the relevant tools and techniques work as they do, so you can use the techniques and concepts learned here as a springboard for further learning and exploration in VR.

What is virtual reality?


Let's start at the beginning, and talk about virtual reality itself. VR, at its most basic level, is a medium that immerses users in a simulated world, allowing them to see, hear, and interact with an environment, and the things within it, that don't actually exist in the physical world around them. Users are fully surrounded by this experience, an effect that VR developers call immersion. Users who are immersed in a space can look around and often move and interact without ever breaking the illusion that they're actually there. Immersion, as we're going to see shortly, is fundamental to the way VR works.

Rob Ruud testing an early build of Ludicrous Speed using an HTC Vive headset

Note

Immersion in VR is a term used to describe a VR system's ability to surround the user with the simulated world. They can look around and, in many cases, move and interact as though they were really there, and because the actual environment is blocked out by the headset, they're given few conflicting cues to remind them that they aren't.

VR hardware

The most common way of immersing a user, and the one we'll be talking about in this book, is through the use of a Head-Mounted Display (HMD), often just referred to as a headset. (There are other ways of doing VR, such as projecting images on walls, but in this book, we focus on head-mounted VR.) The headset displays the virtual world and tracks the movement of the user's head, rotating and shifting the view to create the illusion that they're actually looking around and moving through physical space. Some headsets, though not all of them, include headphones that add to the illusion by making sounds in the environment seem to come from their sources in the virtual world, through a process called spatialized audio.

Note

You'll see the terms HMD and headset used interchangeably throughout this book and in other writing on VR. They both refer to the same thing.
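
Here's what that head tracking looks like from the engine's side. The following is a minimal UE4 C++ sketch, not the book's project code: AMyVRPawn, VRRoot, and Camera are hypothetical names assumed to be declared in the pawn's header, and the project is assumed to list the HeadMountedDisplay module among its build dependencies. With bLockToHmd set, the engine applies the tracked head pose to the camera for you every frame; that automatic update is what produces the looking-around illusion described previously.

// MyVRPawn.cpp - AMyVRPawn is a hypothetical pawn class, shown for illustration only.
#include "MyVRPawn.h"
#include "Components/SceneComponent.h"
#include "Camera/CameraComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"

AMyVRPawn::AMyVRPawn()
{
    // A plain scene component acts as the origin of the tracked space.
    VRRoot = CreateDefaultSubobject<USceneComponent>(TEXT("VRRoot"));
    RootComponent = VRRoot;

    // With bLockToHmd enabled, the engine moves this camera to match the
    // headset's tracked rotation (and position, when available) every frame.
    Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
    Camera->SetupAttachment(VRRoot);
    Camera->bLockToHmd = true;
}

void AMyVRPawn::BeginPlay()
{
    Super::BeginPlay();

    // Track relative to the floor so the camera sits at the user's real eye height.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
}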

Some headsets only track the direction the user is looking, while others can track changes to the user's position as well. If you're using a headset that tracks rotation but not position, and you lean forward to try to look more closely at an object, your viewpoint won't move with you; the object will seem as though it's moving away from you as you try to lean in toward it. If you do this on a headset that tracks position as well, your virtual head will move closer to the object. We use the term Degrees of Freedom (DoF) to describe the ways objects can move in space. (Yes, it's OK to pronounce it doff. All of the developers do.) Take a look at the following points:

  • 3DoF: A device that tracks rotation but doesn't track position is commonly called a 3DoF device because it only tracks the three degrees of freedom that describe rotation: the degree to which the device is leaning to the side (roll), tilting forward or backward (pitch), or turning to the left or right (yaw). Until recently, all mobile VR headsets were 3DoF devices, as they used Inertial Measurement Units (IMUs) similar to those found in cellphones to detect rotation, but had no way to know where they were in space. The Oculus Go and Samsung Gear VR headsets are examples of 3DoF devices.
  • 6DoF: A device that tracks position as well as rotation is a 6DoF device, because it tracks all six degrees of freedom: roll, pitch, and yaw, but also up-and-down, side-to-side, and forward-and-backward movement. Tracking an object's position in space requires a fixed reference point from which to describe its motion. Most first-generation systems needed additional hardware for this: the Lighthouse base stations for the HTC Vive, or the Constellation cameras for the Oculus Rift, provide this positional tracking on desktop systems. Windows Mixed Reality headsets and standalone headsets such as the Oculus Quest and HTC Vive Focus use camera arrays on the headset itself to track its position in the room (we call this inside-out tracking), so they don't require external cameras or base stations. The HTC Vive, Oculus Rift, HTC Vive Focus, Oculus Quest, and Windows Mixed Reality headsets are all 6DoF devices. We'll see after the following note how this difference looks from inside the engine.

Note

3DoF devices track rotation only, so users can look around or point, but their physical movements through space aren't reflected in the virtual world. 6DoF devices track position as well as rotation, so users can not only look around, but move around as well.
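
To make the difference concrete, here's a minimal sketch of how a tracked pose looks to a UE4 application. The engine calls used here are real, but LogHeadPose itself is just an illustrative helper invented for this example: a pose is simply three rotational values plus three positional values, and the practical difference between a 3DoF and a 6DoF device is whether the positional half is actually being tracked.

#include "CoreMinimal.h"
#include "HeadMountedDisplayFunctionLibrary.h"

// Illustrative helper: logs the six values that make up a tracked head pose.
static void LogHeadPose()
{
    FRotator HeadRotation;  // roll, pitch, and yaw: the three rotational DoF
    FVector HeadPosition;   // X, Y, and Z (in centimeters): the three translational DoF
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HeadRotation, HeadPosition);

    // True when the headset currently reports a valid tracked position;
    // a rotation-only (3DoF) device won't.
    const bool bIs6DoF = UHeadMountedDisplayFunctionLibrary::HasValidTrackingPosition();

    UE_LOG(LogTemp, Log, TEXT("Rotation: %s  Position: %s  Positional tracking: %s"),
        *HeadRotation.ToString(), *HeadPosition.ToString(),
        bIs6DoF ? TEXT("yes") : TEXT("no"));
}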

Headsets can either be tethered to a computer, as is the case with the Oculus Rift and the HTC Vive, which allows the full computing power of the attached PC to drive the visuals, or they can be self-contained devices such as the Samsung Gear VR, Oculus Go, Oculus Quest, and HTC Vive Focus. At the time of writing, wireless connections between PCs and VR headsets are beginning to enter the market.

Most headsets also come paired with input devices that allow users to interact with the world, acting as pointers or as hands. Like headsets, these handheld controllers can be tracked in three or six degrees of freedom. 3DoF controllers, such as the Oculus Go's, are essentially pointers: users can aim them, but can't reach out and grab something. 6DoF controllers act much more like virtual hands and allow users to interact with the world in a much greater variety of ways.
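
In UE4, those tracked controllers appear in your pawn as motion controller components. The following continues the hypothetical AMyVRPawn sketch from earlier (LeftController and RightController are assumed to be declared in its header): the engine drives each component's transform from the tracked controller pose every frame, whether that pose is 3DoF or 6DoF, so anything you attach to the component, such as a hand mesh or a laser pointer, follows the user's real hand.

// MyVRPawn.cpp (continued) - still the hypothetical pawn from the earlier sketch.
#include "MotionControllerComponent.h"

AMyVRPawn::AMyVRPawn()
{
    // ... root and camera setup from the earlier sketch ...

    // One motion controller component per hand; the engine updates their
    // transforms from the tracked controller poses.
    LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(VRRoot);
    LeftController->SetTrackingSource(EControllerHand::Left);

    RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
    RightController->SetupAttachment(VRRoot);
    RightController->SetTrackingSource(EControllerHand::Right);
}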

VR isn't just about hardware though

One of the major mistakes many new developers make when first approaching VR is trying to apply the traditional designs they're used to creating for 2D screens to the VR space; for the most part, this doesn't work. VR is its own medium, and it doesn't follow the same rules as the media that came before it. It's worth taking a moment to look at what this means.

When most people first consider VR, they see the headset and assume that it's primarily a visual experience: traditional flat-screen media shown in stereo. It's understandable that it would seem this way, but this perception misses the point. Yes, the VR headset is (depending on whether or not it includes integrated audio) either primarily or entirely a display device, but the experience it creates for the user is very different from the experience created by a traditional flat screen.

Let's imagine for a minute that you're looking at a photo or a 2D video shot from the edge of a tall building, looking down. You see the streets far below, but they don't really feel as though they're far below you. They're just small in the image. Take the same image, but now present it in stereo through a VR headset, and you'll probably experience vertigo. Why is this? Take a look at the following screenshot:

Non-immersive media, no matter how large or detailed, still leaves the viewer surrounded by reminders that the scene isn't real. Immersive media, on the other hand, seems to surround the user completely. (Scene: Soul:City Environment Pack, by Epic Games)

First, as we mentioned a moment ago, you're immersed in the experience. There's nothing else in the surrounding world to remind you that it isn't real. Let's jump back to our previous example—the building edge on your television—turn around and look behind you. Oh. You're just in your living room. Even when you look directly at it, the largest television you could possibly buy still leaves you with lots of peripheral vision to remind you that what you're seeing there isn't real. Everything on a flat screen, even a 3D screen, takes place on the other side of a window. You're watching, but you're not really there. In VR, the window is gone. When you look to the right, the world is still there. Look behind you, and you're still in it. Your perception is completely overtaken by an experience that has become an environment, not just a frame you're looking at.

Second, the stereo image creates a sense of real depth. You can see how far down the drop really goes. The cars in the street below aren't just small, they're far away. In a 6DoF headset that allows motion tracking, your movements in the real world are mirrored in the virtual world. You can lean over the edge or step back. This mixture of immersion, real depth perception, and natural response to your movement comes together to convince your body that what you're perceiving is real. We call this phenomenon presence, and it's a sensation that's mostly experienced physically.

Note

Presence in VR refers to the user feeling that they're actually physically in the virtual world, responding to the environment as though they were really there and experiencing these things. Creating an experience of presence is what VR is all about—this is the major thing it can do that other media can't.

The mechanics of immersion and the resulting experience of presence are unique to VR. No other medium does this.

Note

When reading about VR, you'll sometimes see the terms presence and immersion used interchangeably, but it's generally clearer to think of presence as the goal (the sensation you're trying to create in the user) and immersion as the mechanism by which we achieve it.

Presence is tough to achieve

While we're on the topic of presence, it's worth pointing out that it's a fragile phenomenon, and the current state of VR technology still faces a few challenges in creating a sense of presence fully and reliably. Some of these are rooted in hardware and are almost certainly going to go away as the technology advances. Users can feel the headset on their face, for example, and on wired headsets, they can feel the cable running from the headset. The current generation of headsets offers a field of view that's too narrow to provide peripheral vision. (The desktop devices offer a 110° field of view, while your eyes can perceive a field roughly twice as wide.) Display resolutions aren't yet high enough to keep users from being able to see individual pixels (VR users call this the screen door effect), and finicky optics can blur the user's vision if they're not perfectly aligned. In practice, this means that it's hard to read small text on a VR headset, and that users are sometimes reminded of the hardware when they have to adjust it to get back into the sweet spot for the lenses.

Looking at the state of things, though, it's obvious that these hardware challenges won't last forever. Self-contained and wireless headsets are quickly entering the market, with increasingly reliable tracking that no longer relies on external equipment. Displays are getting wider, resolutions are getting higher, and optical waveguides show great promise for lighter displays with wider in-focus regions. VR works extremely well already, and it's easy to see how it's going to continue to improve.

There are a few other things that can break presence that we can't do as much about—hitting a desk accidentally with a controller, for example, or running into furniture, losing tracking, or hearing sounds from outside the experience. We can manage these when we have control over the user's space, but where we don't, there's not much we can do.

Even given these limitations, though, think about how profoundly the current generation of VR can create a sense of presence in a user, and realize that it only gets better from here. Users believe what they experience in VR to a degree that simply doesn't happen with other media. They explore and learn in ways that aren't possible in any other way. They empathize and connect with people and places more deeply than they could in any way, other than physically being there. Nothing else goes as deep. And we're only getting started.