Unity 2020 Virtual Reality Projects - Third Edition

By: Jonathan Linowes

Overview of this book

This third edition of the Unity Virtual Reality (VR) development guide is updated to cover the latest features of Unity 2019.4 and later versions - the leading platform for building VR games, applications, and immersive experiences for contemporary VR devices. With added focus on newer components, such as the Universal Render Pipeline (URP), extended reality (XR) plugins, the XR Interaction Toolkit package, and the latest VR devices, this edition will bring you up to date with the current state of VR.

With its practical, project-based approach, this book covers the specifics of virtual reality development in Unity. You'll learn how to build VR apps that can be experienced with modern devices from Oculus, VIVE, and others. This virtual reality book presents lighting and rendering strategies to help you build cutting-edge graphics, and explains URP and rendering concepts that will enable you to achieve realism in your apps.

You'll build real-world VR experiences using world space user interface canvases, locomotion and teleportation, 360-degree media, and timeline animation, and you'll learn important VR development concepts, best practices, and performance optimization and user experience strategies. By the end of this Unity book, you'll be fully equipped to use Unity to develop rich, interactive virtual reality experiences.
Table of Contents (15 chapters)

Summary

In this chapter, we explored a variety of software patterns for handling user input in your VR projects. The player uses a controller button, the trigger, to create, inflate, and release balloons into the scene. First, we tried the standard Input class to detect logical button presses, such as XRI_Right_TriggerButton, implemented with a polling design pattern. Then we replaced polling with Unity events, decoupling our BalloonController script from the input itself. This decoupling became even more important later, when we used the XR Interaction Toolkit's Interactor events to implement the same mechanic.
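The contrast between the two patterns might be sketched as follows. This is a minimal illustration, not the book's actual BalloonController; the axis name XRI_Right_TriggerButton assumes the Input Manager mappings used with Unity's XR input, and the event-component split shown here is one possible arrangement:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Polling pattern: the controller interrogates the Input class every frame,
// coupling this script directly to the input configuration.
public class PollingBalloonController : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("XRI_Right_TriggerButton"))
            Debug.Log("Create balloon");   // start inflating
        if (Input.GetButtonUp("XRI_Right_TriggerButton"))
            Debug.Log("Release balloon");  // let it float away
    }
}

// Event pattern: a separate component raises UnityEvents, and the balloon
// logic merely subscribes -- it no longer knows where the input comes from.
public class ButtonInputEvents : MonoBehaviour
{
    public UnityEvent ButtonDownEvent = new UnityEvent();
    public UnityEvent ButtonUpEvent = new UnityEvent();

    void Update()
    {
        if (Input.GetButtonDown("XRI_Right_TriggerButton"))
            ButtonDownEvent.Invoke();
        if (Input.GetButtonUp("XRI_Right_TriggerButton"))
            ButtonUpEvent.Invoke();
    }
}
```

With the event pattern, the balloon script registers listeners on ButtonDownEvent and ButtonUpEvent (in the Inspector or in code), so swapping the input source later requires no change to the balloon logic.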

We learned about the XR Interaction Toolkit and its Interactor/Interactable design pattern. We saw how the XR Rig's hand controllers are the Interactors in the scene. We also created Interactables, including the balloon gun and the ball projectile, that you can grab, activate, and throw. We learned how to wire into the Interaction...
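Wiring into an Interactable's events might look like the following sketch. BalloonGun is a hypothetical component name, and the exact event names and argument types vary across XR Interaction Toolkit versions (selectEntered and activated shown here follow the 1.x API; earlier previews used onSelectEnter and onActivate):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical grabbable gun that reacts to XR Interaction Toolkit events.
[RequireComponent(typeof(XRGrabInteractable))]
public class BalloonGun : MonoBehaviour
{
    void OnEnable()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        // selectEntered fires when an Interactor (a hand controller) grabs us;
        // activated fires when the trigger is pressed while we are held.
        interactable.selectEntered.AddListener(OnGrabbed);
        interactable.activated.AddListener(OnTriggerPulled);
    }

    void OnGrabbed(SelectEnterEventArgs args)
    {
        Debug.Log("Grabbed by " + args.interactor.name);
    }

    void OnTriggerPulled(ActivateEventArgs args)
    {
        Debug.Log("Fire!");  // e.g. launch the ball projectile
    }
}
```

Because the hand controllers are the Interactors, the same BalloonGun works with either hand without referencing any controller directly.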