Complete Virtual Reality and Augmented Reality Development with Unity

By: Jesse Glover, Jonathan Linowes

Overview of this book

Unity is the leading platform for developing mixed reality experiences because it provides a great pipeline for working with 3D assets. Using a practical, project-based approach, this Learning Path teaches you the specifics of AR and VR development using Unity 2018 and Unity 3D. You'll learn to integrate, animate, and overlay 3D objects on your camera feed before moving on to implement sensor-based AR applications. You'll explore various concepts by creating an AR application using Vuforia, on both macOS and Windows, for Android and iOS devices. Next, you'll learn how to develop VR applications that can be experienced with devices such as Oculus and Vive. You'll also explore various tools for VR development: gaze-based versus hand controller input, world space UI canvases, locomotion and teleportation, timeline animation, and multiplayer networking. You'll learn the Unity 3D game engine via the interactive Unity Editor and C# programming. By the end of this Learning Path, you'll be fully equipped to develop rich, interactive mixed reality experiences using Unity.

This Learning Path includes content from the following Packt products:
- Unity Virtual Reality Projects - Second Edition by Jonathan Linowes
- Unity 2018 Augmented Reality Projects by Jesse Glover
Table of Contents (24 chapters)

Polling for clicks


The simplest way to obtain user input is simply to read the current data from an input component. We've already seen this using the Input class and the VR SDK. Now, we will write our own input component, MyInputController, that maps the Unity (or SDK) input to our own simple API. Then, we'll write a BalloonController that polls that input, as illustrated:
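A minimal sketch of this polling pattern follows. The method names `ButtonDown` and `ButtonUp`, and the use of the generic `"Fire1"` button, are illustrative assumptions; the book's actual MyInputController may map different SDK calls:

```csharp
using UnityEngine;

// Wraps Unity (or SDK-specific) input behind our own simple API.
public class MyInputController : MonoBehaviour
{
    // True on the frame the button was pressed.
    // NOTE: "Fire1" is an assumed mapping for illustration.
    public bool ButtonDown()
    {
        return Input.GetButtonDown("Fire1");
    }

    // True on the frame the button was released.
    public bool ButtonUp()
    {
        return Input.GetButtonUp("Fire1");
    }
}

// Polls MyInputController once per frame.
public class BalloonController : MonoBehaviour
{
    private MyInputController inputController;

    void Start()
    {
        // Assumes the component lives on the MeMyselfEye player rig,
        // as described below.
        inputController = GameObject.Find("MeMyselfEye")
            .GetComponent<MyInputController>();
    }

    void Update()
    {
        if (inputController.ButtonDown())
        {
            // e.g. create and start inflating a balloon
        }
        else if (inputController.ButtonUp())
        {
            // e.g. release the balloon
        }
    }
}
```

Because polling happens in `Update()`, the controller sees at most one press/release transition per frame, which is sufficient for simple click-driven interactions like this one.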

Our own button interface functions

You may recall that the MeMyselfEye player rig may have device-specific toolkit child objects for a particular VR SDK. The version for OpenVR, for example, has the [CameraRig] prefab. The version for Daydream has the DaydreamPlayer prefab. It makes sense to add our MyInputController component to MeMyselfEye, as it may make device-specific SDK calls. In this way, should you want to maintain camera rig prefabs for a variety of platforms, and swap them in and out as you build the project for a different VR target, the API that is exposed to the rest of your application will be consistent and...
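One common way to keep that API consistent across platform-specific rigs is conditional compilation inside MyInputController. The sketch below is an assumption about how such a wrapper might look, not the book's exact code; `OVRInput.GetDown` is the real Oculus Integration call, while the `#else` branch falls back to a generic Unity button mapping:

```csharp
// Inside MyInputController: one public API, device-specific internals.
public bool ButtonDown()
{
#if OCULUS_SDK_PRESENT  // hypothetical define set when the Oculus SDK is in use
    // Oculus Integration: primary index trigger press this frame.
    return OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger);
#else
    // Generic fallback (OpenVR and others via Unity's input manager).
    return Input.GetButtonDown("Fire1");
#endif
}
```

Callers such as BalloonController never see the `#if` branches; they only call `ButtonDown()`, so swapping the rig prefab for a different VR target requires no changes elsewhere in the application.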