Mastering Oculus Rift Development

By: Jack Donovan

Overview of this book

Virtual reality (VR) is changing the world of gaming and entertainment as we know it. VR headsets such as the Oculus Rift immerse players in a virtual world by tracking their head movements and simulating depth, giving them the feeling that they are actually present in the environment. We will first use the Oculus SDK, then move on to the widely popular Unity engine, showing you how to add that extra edge to your VR games using the power of Unity. In this book, you'll learn how to take advantage of this new medium by designing around each of its unique features. The book demonstrates the Unity 5 game engine, one of the most widely used engines for VR development, and takes you through a comprehensive project that covers everything necessary to create and publish a complete VR experience for the Oculus Rift. You will also learn to identify the common perils and pitfalls of VR development to ensure that your audience has the most comfortable experience possible. By the end of the book, you will be able to create an advanced VR game for the Oculus Rift, and you'll have everything you need to bring your ideas into a new reality.
Table of Contents (17 chapters)
Mastering Oculus Rift Development
Credits
About the Author
About the Reviewer
www.PacktPub.com
Customer Feedback
Preface

Common limitations of VR games


While VR provides the ability to immerse a player's senses like never before, it also creates some new, unique problems that must be addressed by responsible VR developers.

Locomotion sickness

VR headsets are meant to make you feel like you're somewhere else, and it only makes sense that you'd want to explore that somewhere. Unfortunately, common game mechanics such as traditional joystick locomotion are problematic for VR. Our inner ears are accustomed to sensing inertia as we move from place to place, so if you push a joystick forward to walk in VR, your body gets confused when it senses that you're actually still stationary.

Typically, when there's a mismatch between what we're seeing and what we're feeling, our bodies assume that a nefarious poison or illness is at work and prepare to rid themselves of the culprit; that's the motion sickness you feel when reading in a car, standing on a boat, and, yes, moving in VR. This doesn't mean that we have to prevent users from moving in VR; we just need to be more clever about it (more on that later).

Note

The primary cause of nausea with traditional joystick movement in VR is acceleration and smooth movement: your brain gets confused when picking up speed or slowing down, and even constant smooth motion can cause nausea (car sickness, for instance). Rotation is even more problematic, because rotating smoothly creates discomfort almost immediately. Some developers get around this by using hard increments instead of gradual motion, such as rotating in 30-degree "snaps" once per second instead of rotating smoothly.
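The snap-rotation idea above can be sketched in a few lines. This is a minimal, engine-agnostic Python sketch (the book's own projects use Unity); the class name, constants, and per-frame update loop are illustrative assumptions, not part of the Oculus SDK or any engine API:

```python
# Hypothetical sketch of "snap" rotation: instead of rotating smoothly
# while the stick is held, rotate in hard increments with a cooldown,
# so the player never perceives gradual rotation.

SNAP_ANGLE = 30.0      # degrees rotated per snap
SNAP_COOLDOWN = 1.0    # minimum seconds between snaps
DEADZONE = 0.5         # the stick must be pushed past this to trigger

class SnapTurner:
    def __init__(self):
        self.yaw = 0.0            # player's current heading, in degrees
        self.cooldown_left = 0.0  # time until the next snap is allowed

    def update(self, stick_x, dt):
        """Call once per frame with the horizontal stick axis (-1..1)
        and the frame time in seconds."""
        self.cooldown_left = max(0.0, self.cooldown_left - dt)
        if self.cooldown_left == 0.0 and abs(stick_x) > DEADZONE:
            direction = 1.0 if stick_x > 0 else -1.0
            # One hard increment, applied instantaneously
            self.yaw = (self.yaw + direction * SNAP_ANGLE) % 360.0
            self.cooldown_left = SNAP_COOLDOWN
```

Holding the stick right would then turn the player 30 degrees immediately, then again one second later, and so on, rather than spinning them continuously.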

Lack of real-world vision

One of the clumsiest aspects of VR is getting your hands where they need to be without being able to see them. Whether you're using a gamepad, keyboard, or motion controller, you'll probably need your hands to interact with VR, and you can't see them with an HMD sitting over your eyes. It's good practice to centralize input around resting positions (that is, the buttons naturally closest to your thumbs on a gamepad, or the home row of a computer keyboard), and to shy away from anything that requires complex, precise input, such as typing sentences on a keyboard or hitting button combos on a controller.

Some VR headsets, such as the HTC Vive, have a forward-facing camera (sometimes called a passthrough camera) that users can choose to view in VR, enabling basic interaction with the real world without taking the headset off. The Oculus Rift doesn't feature a built-in camera, but you could still display the feed from an external camera on any surface in the virtual environment (we'll play with that idea later in the book).

Note

Even before modern VR, developers were creating applications that overlay contextual information on top of what a camera is seeing; that's called augmented reality (AR). Experiences that ride the line between camera output and virtual environments are called mixed reality (MR).

Unnatural head movements

You may not have thought about it before, but looking around in a traditional first-person shooter (FPS) is quite different from looking around using your head. The right analog stick is typically used to direct the camera and make adjustments as necessary, but in VR, players actually move their heads instead of using their thumbs to move a virtual head. Don't expect players in VR to be able to make the snappy pivots and on-a-dime 180-degree turns that are simple in a regular console game.

Vergence-accommodation conflict

Another limitation to consider when designing your VR game is what's called a vergence-accommodation conflict. This is what happens when the vergence distance, the distance at which your eyes converge on the point of focus (that is, an object in VR), differs notably from your accommodation distance, the fixed distance at which your eyes must actually focus on the screen in front of you.

An image from a research article in the Journal of Vision (linked at the end of this section) demonstrates the conflicting difference.

Forcing the user to focus on objects that are too close or too far away for extended periods can cause symptoms of eye fatigue, including sore eyes and headaches. It's therefore important to consider the placement of the elements of your game that will draw the most attention.
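To make the conflict concrete, the vergence angle (the angle between the two eyes' lines of sight) can be computed from simple geometry. This is a hedged Python sketch; the interpupillary distance and the headset's focal-plane distance are illustrative assumptions, not Oculus Rift specifications:

```python
import math

IPD = 0.064          # assumed interpupillary distance in meters (typical adult)
FOCAL_PLANE = 2.0    # assumed fixed focal distance of the HMD optics, in meters

def vergence_angle_deg(distance_m):
    """Angle (degrees) the eyes converge through to fixate a point
    at distance_m, from the isosceles triangle formed by the two eyes."""
    return math.degrees(2.0 * math.atan(IPD / (2.0 * distance_m)))

# While accommodation stays locked near the focal plane, vergence varies
# with the virtual object's distance; the mismatch grows for close objects.
for d in (0.25, 0.5, FOCAL_PLANE, 10.0):
    print(f"object at {d:>5} m -> vergence {vergence_angle_deg(d):5.2f} deg")
```

A virtual object at 0.25 m demands far stronger convergence than the roughly 1.8 degrees that matches a 2 m focal plane, while the eyes must keep focusing at the plane itself; that mismatch is the source of the fatigue described above.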

The full article, titled Vergence-accommodation conflicts hinder visual performance and cause visual fatigue, by David M. Hoffman, Ahna R. Girshick, Kurt Akeley, and Martin S. Banks, is available at http://jov.arvojournals.org/article.aspx?articleid=2122611. It is a valuable resource for avoiding depth cues that may cause eye fatigue.