Hands-On Augmented Reality [Video]

By: Animorph Co-op

Overview of this book

Previous-generation AR apps were limited to events triggered by markers. Now experiences can be rooted in the user's space and context. This critical shift requires developers to update their skills as well as their thinking. It's time for you to go AR-native and build relevant, smart apps that people want to use for practical reasons. This course will make you fluent in innovative AR development and boost your out-of-the-box thinking to advance your career!

We will create apps people want to use and explore, build a robust portfolio, and deploy the course projects to app stores. We will make an interactive speaking companion, a multiplayer app for collaborating on landscape design, and a VR-enabled AR environment customization app that lets you alter the surfaces and lighting of your room!

Unity 3D is the leading engine for scaffolding complex AR/VR experiences, as demonstrated by its native AR Foundation framework, and we will use it throughout the course. This hands-on course will equip you with powerful tools to reimagine AR and turn it into reality.

*Note: Unity version 2018.2 is used for this course. The setups for cloud anchors (second project) and split-screen view (third project) differ in Unity 2019.2, but both concepts remain relevant and transferable. The respective videos within the course explain the evolution of the stack.

The code bundle for this video course is available at https://github.com/PacktPublishing/Hands-on-Augmented-Reality-V-
Table of Contents (8 chapters)
Chapter 7
Plane Modifications using Gaze Mechanics
Section 4
Light Estimation and Shipping
In the previous section, we developed our color palette interface. In the final video of this section, we will enhance the effect by looking into light estimation in AR Foundation. We will access the light estimation data and demonstrate how its values can modulate material smoothness and directional light intensity. We will also reposition the light using gaze. Finally, we will build our app and consider how we could harness what AR Foundation light estimation has to offer at this time. We are delighted to have delivered a hands-free AR experience, though we recognize the difficulties we faced while developing AR in VR mode. It is great to wrestle with a setup, even if it soon goes out of date, because each time we take things apart, we learn about the underlying principles.
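As a rough illustration of the idea, here is a minimal sketch of reading AR Foundation light estimation data and feeding it into a directional light and a material's smoothness. It assumes the AR Foundation 2.x-style API (ARCameraManager.frameReceived and ARLightEstimationData); the exact class and event names differ slightly in the Unity 2018.2 stack used in the course, and the targetRenderer field and the Standard-shader _Glossiness property are illustrative choices rather than the course's exact setup.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: drive a directional light and a material's smoothness from
// AR Foundation light estimation (AR Foundation 2.x-style API assumed).
[RequireComponent(typeof(Light))]
public class LightEstimationDriver : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;  // AR camera providing frame events
    [SerializeField] Renderer targetRenderer;        // renderer whose material we modulate (illustrative)

    Light directionalLight;

    void Awake()
    {
        directionalLight = GetComponent<Light>();
    }

    void OnEnable()
    {
        if (cameraManager != null)
            cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable()
    {
        if (cameraManager != null)
            cameraManager.frameReceived -= OnFrameReceived;
    }

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        ARLightEstimationData estimate = args.lightEstimation;

        // Average brightness (0..1) drives the directional light intensity.
        if (estimate.averageBrightness.HasValue)
            directionalLight.intensity = estimate.averageBrightness.Value;

        // Colour correction (where the platform provides it) tints the light
        // to match the environment.
        if (estimate.colorCorrection.HasValue)
            directionalLight.color = estimate.colorCorrection.Value;

        // Reuse brightness to modulate Standard-shader smoothness ("_Glossiness"),
        // so surfaces look duller in dim rooms and shinier in bright ones.
        if (targetRenderer != null && estimate.averageBrightness.HasValue)
            targetRenderer.material.SetFloat("_Glossiness", estimate.averageBrightness.Value);
    }
}
```

In this sketch the script would sit on the scene's directional light, with the AR camera and a sample renderer assigned in the Inspector; the same per-frame values would feed the light that the course repositions with gaze.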