Cardboard VR Projects for Android

By: Jonathan Linowes, Matt Schoen

Overview of this book

Google Cardboard is a low-cost, entry-level media platform through which you can experience virtual reality and virtual 3D environments. Its applications are as broad and varied as mobile smartphone applications themselves. This book teaches you the best practices and methodology needed to build effective, stable, and performant mobile VR applications.

We begin by defining virtual reality (VR) and explaining how Google Cardboard fits into the larger VR and Android ecosystem. We introduce the underlying scientific and technical principles behind VR, including geometry, optics, rendering, and mobile software architecture. We start with a simple example app that ensures your environment is properly set up to write, build, and run apps. Then we develop a reusable VR graphics engine that you can build upon. From then on, each chapter is a self-contained project in which you build an example from a different genre of application: a 360-degree photo viewer, an educational simulation of our solar system, a 3D model viewer, and a music visualizer.

Given the updates rolled out at Google I/O 2016, the authors of Cardboard VR Projects for Android have collated technical notes to help you execute the projects in this book with the Google VR Cardboard Java SDK 0.8, released in May 2016. Refer to the article at https://www.packtpub.com/sites/default/files/downloads/GoogleVRUpdateGuideforCardbook.pdf, which explains the updates to the projects' source code.
Table of Contents (16 chapters)
Cardboard VR Projects for Android
Credits
About the Authors
About the Reviewers
www.PacktPub.com
Preface
Index

Detect looking at objects


Wait, there's more! Just one more thing to add. Building interactive applications requires us to be able to determine whether the user is gazing at a specific object. We can put this capability into RenderObject, so that any object in the scene can be gaze-detected.

The technique that we'll implement is straightforward. Since each object we render is projected onto a plane facing the camera, we really only need to determine whether the user is looking at the object's plane. Essentially, we check whether the vector from the camera to the plane's position points in the same direction as the camera's view direction. But we'll allow some tolerance, so you don't have to look exactly at the center of the plane (that would be impractical); instead, we accept a narrow range. A good way to do this is to calculate the angles between the two vectors: the pitch and yaw angles (the up/down rotation about the X axis and the left/right rotation about the Y axis, respectively). Then, we check whether these angles are within...
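The pitch/yaw check can be sketched as follows. This is a minimal illustration, not the book's exact code: it assumes the object's position has already been transformed into eye (camera) space, where the camera looks down the negative Z axis, and the class name and tolerance values are hypothetical.

```java
// Sketch of gaze detection via pitch and yaw angles.
// Assumes (x, y, z) is the object's position in eye space,
// where the camera sits at the origin looking down -Z.
public class GazeDetector {
    // Tolerance, in radians, for how far off-center the gaze may be
    private static final float PITCH_LIMIT = 0.15f; // up/down (about X axis)
    private static final float YAW_LIMIT = 0.15f;   // left/right (about Y axis)

    public static boolean isLookingAtObject(float x, float y, float z) {
        // Angle of the object above/below the view direction
        float pitch = (float) Math.atan2(y, -z);
        // Angle of the object left/right of the view direction
        float yaw = (float) Math.atan2(x, -z);
        // The user "looks at" the object when both angles fall
        // within the tolerance range
        return Math.abs(pitch) < PITCH_LIMIT && Math.abs(yaw) < YAW_LIMIT;
    }
}
```

For example, an object straight ahead at (0, 0, -5) passes the check, while one far off to the side at (5, 0, -1) fails, since its yaw angle is roughly 1.37 radians, well outside the tolerance.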