Complete Virtual Reality and Augmented Reality Development with Unity

By: Jesse Glover, Jonathan Linowes

Overview of this book

Unity is the leading platform for developing mixed reality experiences because it provides a great pipeline for working with 3D assets. Using a practical, project-based approach, this Learning Path teaches you the specifics of AR and VR development using Unity 2018 and Unity 3D. You'll learn to integrate, animate, and overlay 3D objects on your camera feed before moving on to implement sensor-based AR applications. You'll explore various concepts by creating an AR application with Vuforia, developing on both macOS and Windows for Android and iOS devices. Next, you'll learn how to develop VR applications that can be experienced with devices such as Oculus and Vive. You'll also explore various tools for VR development: gaze-based versus hand-controller input, world space UI canvases, locomotion and teleportation, timeline animation, and multiplayer networking. You'll learn the Unity 3D game engine via the interactive Unity Editor and C# programming. By the end of this Learning Path, you'll be fully equipped to develop rich, interactive mixed reality experiences using Unity.

This Learning Path includes content from the following Packt products:
• Unity Virtual Reality Projects - Second Edition by Jonathan Linowes
• Unity 2018 Augmented Reality Projects by Jesse Glover
Table of Contents (24 chapters)
Title Page
Copyright
About Packt
Contributors
Preface
Index

Summary


In this chapter, we explored a variety of software patterns for handling user input in your VR projects. The player uses a controller button to create, inflate, and release balloons into the scene. First, we tried the standard Input class for detecting logical button clicks, such as the "Fire1" button, and then learned how to access device-specific SDK input, such as the OpenVR trigger button with haptic feedback.
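The logical-button approach described above can be sketched as a simple Unity script; the class name and comments here are illustrative, not the chapter's exact code:

```csharp
using UnityEngine;

// A minimal sketch of polling Unity's standard Input class for the
// logical "Fire1" button each frame (hypothetical example).
public class BalloonButtonPoller : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Fire1"))
        {
            // Button was pressed this frame: e.g., create a new balloon
        }
        else if (Input.GetButton("Fire1"))
        {
            // Button is held: e.g., keep inflating the balloon
        }
        else if (Input.GetButtonUp("Fire1"))
        {
            // Button was released this frame: e.g., release the balloon
        }
    }
}
```

Because "Fire1" is a logical mapping defined in Unity's Input Manager, the same script works across input devices without referencing any device-specific SDK.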

In our scene, we implemented a simple input component for polling the button actions. Then, we refactored the code to use scriptable objects to hold the input action data. In the third implementation, we used Unity Events to message input actions to listening components. We also enhanced the scene to attach the balloon to your virtual hand position and added the ability to pop the balloons with explosive projectiles! Lastly, we used an interactable framework (for SteamVR and Daydream) to implement grabbing and throwing mechanics, using components provided by those toolkits.
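The Unity Events pattern mentioned above can be sketched as follows: an input component invokes UnityEvents, and listening components subscribe in the Inspector or in code. The class and field names here are hypothetical, assuming the "Fire1" logical button from earlier:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hedged sketch of event-driven input: this component raises UnityEvents
// on button press and release, and listeners (e.g., a balloon controller)
// subscribe without polling input themselves.
public class ButtonInputEvents : MonoBehaviour
{
    public string buttonName = "Fire1";

    // Assignable in the Inspector, or subscribed to via AddListener()
    public UnityEvent ButtonDownEvent = new UnityEvent();
    public UnityEvent ButtonUpEvent = new UnityEvent();

    void Update()
    {
        if (Input.GetButtonDown(buttonName))
            ButtonDownEvent.Invoke();

        if (Input.GetButtonUp(buttonName))
            ButtonUpEvent.Invoke();
    }
}
```

A listening component would then call, for example, `buttonInput.ButtonDownEvent.AddListener(CreateBalloon)`, decoupling input detection from the balloon logic.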