
Augmented Reality with Unity AR Foundation

By: Jonathan Linowes

Overview of this book

Augmented reality applications allow people to interact meaningfully with the real world through digitally enhanced content. The book starts by helping you set up for AR development, installing the Unity 3D game engine, required packages, and other tools to develop for Android (ARCore) and/or iOS (ARKit) mobile devices. Then we jump right into building and running AR scenes, learning about AR Foundation components, other Unity features, C# coding, troubleshooting, and testing.

We create a framework for building AR applications that manages user interaction modes, user interface panels, and AR onboarding graphics, which you will save as a template for reuse in other projects in this book. Using this framework, you will build multiple projects, starting with a virtual photo gallery that lets you place your favorite framed photos on your real-world walls and interactively edit these virtual objects. Other projects include an educational image tracking app for exploring the solar system, and a fun selfie app to put masks and accessories on your face.

The book provides practical advice and best practices that will have you up and running quickly. By the end of this AR book, you will be able to build your own AR applications, engaging your users in new and innovative ways.
Table of Contents (14 chapters)
Section 1 – Getting Started with Augmented Reality
Section 2 – A Reusable AR User Framework
Section 3 – Building More AR Projects

Understanding AR interaction flow

In an Augmented Reality application, one of the first things the user must do is scan the environment with the device camera, slowly moving their device around until it detects geometry for tracking. This might be horizontal planes (floor, tabletop), vertical planes (walls), a human face, or other objects. A simplistic user flow given in many example scenes is shown in the following diagram:
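In AR Foundation, plane detection is exposed through the ARPlaneManager component, which raises an event as trackables are found. The following sketch (not from the book; the class name is hypothetical and assumes the AR Foundation 4.x API) shows one way an app can learn that scanning has succeeded:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: report when the first trackable plane is detected,
// so the UI can move past the "scan your environment" prompt.
public class ScanStatusReporter : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assign in the Inspector

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        if (args.added.Count > 0)
            Debug.Log($"Tracking achieved: {args.added.Count} new plane(s) detected");
    }
}
```

A face-tracking or image-tracking app would subscribe to the analogous events on ARFaceManager or ARTrackedImageManager instead.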

Figure 4.1 – A simple AR onboarding user workflow

As shown in the preceding diagram, the app starts by checking for AR support, asking the user for permission to access the device camera, and performing other initializations. Then, the app asks the user to scan the environment for trackable objects, and may need to report scanning problems, such as when the room is too dark or there's not enough texture to detect features. Once tracking is achieved, the user is prompted to tap the screen to place a virtual object in the scene.
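The tail end of this flow, waiting for a tracking session and then placing an object on a tap, can be sketched with AR Foundation's session state and raycast APIs. This is a minimal illustration, not the book's framework code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: once the AR session is tracking, a screen tap
// raycasts against detected planes and places (or moves) a prefab there.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // assign in the Inspector
    [SerializeField] GameObject placedPrefab;         // the virtual object to place

    GameObject spawnedObject;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // Do nothing until AR Foundation reports an actively tracking session.
        if (ARSession.state != ARSessionState.SessionTracking) return;
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch point against detected plane geometry.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose; // hits are sorted nearest-first
            if (spawnedObject == null)
                spawnedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            else
                spawnedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}
```

Checking ARSession.state also covers the earlier steps of the diagram: values such as Unsupported, NeedsInstall, and CheckingAvailability let the app drive the support-check and permission UI before scanning begins.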

This is...