Mastering Oculus Rift Development

By: Jack Donovan

Overview of this book

Virtual reality (VR) is changing the world of gaming and entertainment as we know it. VR headsets such as the Oculus Rift immerse players in a virtual world by tracking their head movements and simulating depth, giving them the feeling that they are actually present in the environment. This book begins with the Oculus SDK and then moves on to the Unity 5 game engine, one of the most widely used engines for VR development, showing you how to add that extra edge to your VR games with the power of Unity. You'll learn how to take advantage of this new medium by designing around each of its unique features, working through a comprehensive project that covers everything necessary to create and publish a complete VR experience for the Oculus Rift. You will also learn to identify the common perils and pitfalls of VR development so that your audience has the most comfortable experience possible. By the end of the book, you will be able to create an advanced VR game for the Oculus Rift, and you'll have everything you need to bring your ideas into a new reality.

Summary


In this chapter, you brought sound to your game environment using a few different methods. Good audio is what holds a VR environment together, and the importance of realistic sound to immersion should not be underestimated.

We began with simple 2D audio, which is fast and easy to implement but only suitable for scenarios in which location doesn't matter, such as menu sounds. We then modified Unity's Audio Source component to spatialize it and create basic stereophonic 3D audio.
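
As a minimal sketch (the component and field names here are illustrative, not taken from the chapter), the difference comes down to the Audio Source's spatialBlend property: 0 keeps a clip flat and location-independent, while 1 makes its volume and panning depend on the source's position relative to the Audio Listener.

    using UnityEngine;

    // Illustrative sketch only: field names and values are assumptions, not code
    // from the chapter. It shows the 2D-versus-3D distinction controlled by
    // AudioSource.spatialBlend.
    [RequireComponent(typeof(AudioSource))]
    public class SpatializedSoundExample : MonoBehaviour
    {
        [Range(0f, 1f)]
        public float spatialBlend = 1f;   // 0 = flat 2D audio, 1 = fully positional 3D audio
        public AudioClip clipToPlay;      // assign an ambient or effect clip in the Inspector

        private void Start()
        {
            AudioSource source = GetComponent<AudioSource>();

            // A menu sound would use spatialBlend = 0 so its volume and panning never
            // change; an in-world sound uses 1 so both depend on where this GameObject
            // sits relative to the Audio Listener (the player's head).
            source.spatialBlend = spatialBlend;

            source.clip = clipToPlay;
            source.loop = true;
            source.Play();
        }
    }

Attached to an in-world object with spatialBlend set to 1, a component like this gives the basic positional behavior described above.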

Stereophonic audio can help players locate the origin of sounds around them, but it can't be used to determine whether the sound is coming from above or below the player because the "head" in the audio calculation is completely static and the ears are always in the same relative orientation.
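
A simplified model (this is not Unity's actual panning code) makes the limitation concrete: if the pan value is derived only from the left/right component of the direction to the sound, a source directly above the listener and one directly below it produce exactly the same output.

    using UnityEngine;

    // Simplified model of plain stereo panning: the pan depends only on the
    // left/right component of the direction to the sound, so the vertical
    // component (above versus below the listener) has no effect at all.
    public static class NaiveStereoPan
    {
        // Returns -1 (fully left) to +1 (fully right) for a sound at soundPosition
        // heard by a listener with the given transform.
        public static float ComputePan(Transform listener, Vector3 soundPosition)
        {
            Vector3 toSound = (soundPosition - listener.position).normalized;

            // Project onto the listener's right axis; the up/down offset is ignored,
            // which is exactly why elevation can't be conveyed this way.
            return Vector3.Dot(listener.right, toSound);
        }
    }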

We then added another dimension of realism to our simple 3D audio by implementing HRTFs, or head-related transfer functions. HRTFs are only possible with head-mounted devices because they take head/ear rotation into account when calculating how each sound reaches the listener's ears.
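
As a sketch of how that typically looks in Unity (assuming a spatializer plugin such as the Oculus Spatializer has been selected under Edit > Project Settings > Audio), an individual Audio Source opts in to HRTF processing through its spatialize flag:

    using UnityEngine;

    // Sketch, assuming a spatializer plugin (for example the Oculus Spatializer)
    // has already been chosen in the project's audio settings. The AudioSource
    // must be fully 3D and explicitly marked as spatialized.
    [RequireComponent(typeof(AudioSource))]
    public class HrtfSoundExample : MonoBehaviour
    {
        private void Start()
        {
            AudioSource source = GetComponent<AudioSource>();

            source.spatialBlend = 1f;   // the source must be fully 3D
            source.spatialize = true;   // route it through the selected spatializer (HRTF)

            source.loop = true;
            source.Play();
        }
    }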