Hands-On Augmented Reality [Video]

By: Animorph Co-op

Overview of this book

Previous-generation AR apps were limited to events triggered by markers. Now experiences can be rooted in the user's space and context. This critical shift requires developers to update their skills as well as their thinking. It's time for you to go AR-native and build relevant, smart apps that people want to use for practical reasons. This course will make you fluent in innovative AR development and boost your out-of-the-box thinking to advance your career!

We will create apps people want to use and explore, build a robust portfolio, and deploy course projects to app stores. We will make an interactive speaking companion, a multiplayer app enabling you to collaborate on landscape design, and a VR-enabled AR environment customization app where you can alter the surfaces and lighting of your room!

Unity 3D is the undisputed leader in providing scaffolding for complex AR/VR, as proven time and again with the native AR Foundation, and we will be using it throughout the course. This hands-on course will equip you with powerful tools to reimagine AR and turn it into reality.

*Note: Unity version 2018.2 is used for this course. The setups for cloud anchors (second project) and split-screen view (third project) are different in Unity 2019.2; however, both concepts remain relevant and transferable. Respective videos within the course explain the evolution of the stack.

The code bundle for this video course is available at https://github.com/PacktPublishing/Hands-on-Augmented-Reality-V-
Table of Contents (8 chapters)
Chapter 6
Project 3 – Experiencing Smart AR in VR Mode
Section 2
Combining AR with VR
In the last video, we gained grounding in the wearable realm. In this lesson, we will implement a VR build in Unity using AR Foundation. If we merely enable Virtual Reality mode in the Unity Player settings, we end up with a distorted image that can easily cause nausea for users wearing the headset, and the tracking is noticeably off. Our approach instead renders to two screens, using a distortion shader embedded in an open-source repository, as sketched below. By the end of this video we will have a working mixed reality app!
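To make the two-screen idea concrete, here is a minimal sketch of a side-by-side stereo rig in Unity, assuming two cameras offset by an eye separation, each restricted to one half of the screen via its viewport rect. The component name, fields, and the eyeSeparation value are illustrative, not from the course repository, and the lens-distortion correction provided by the open-source shader mentioned in the video would be layered on top of each eye's output.

using UnityEngine;

// Illustrative sketch: render the AR scene into two side-by-side viewports,
// one per eye, as a starting point for a Cardboard-style mixed reality view.
// Attach to a rig object placed at the tracked AR camera pose.
public class SplitScreenStereoRig : MonoBehaviour
{
    // Horizontal offset between the two eye cameras, in metres (assumed value).
    public float eyeSeparation = 0.064f;

    void Start()
    {
        // Left eye renders to the left half of the screen.
        CreateEyeCamera("LeftEye", -eyeSeparation * 0.5f, new Rect(0f, 0f, 0.5f, 1f));
        // Right eye renders to the right half of the screen.
        CreateEyeCamera("RightEye", eyeSeparation * 0.5f, new Rect(0.5f, 0f, 0.5f, 1f));
    }

    Camera CreateEyeCamera(string name, float xOffset, Rect viewport)
    {
        var go = new GameObject(name);
        go.transform.SetParent(transform, false);
        go.transform.localPosition = new Vector3(xOffset, 0f, 0f);

        var cam = go.AddComponent<Camera>();
        cam.rect = viewport; // restrict this camera's output to one half of the screen
        // A barrel-distortion correction pass (e.g. from an open-source shader)
        // would be applied to each eye's image before it reaches the headset lenses.
        return cam;
    }
}

In practice you would parent this rig under the AR session's camera transform so that both eye cameras inherit the device tracking, which is what avoids the mismatch between head movement and rendered motion that makes the naive Player-settings approach so uncomfortable.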