Learn ARCore - Fundamentals of Google ARCore

Overview of this book

Are you a mobile or web developer who wants to create immersive Augmented Reality apps with Google's ARCore platform? If so, this book will help you jump straight into developing with ARCore and guide you, step by step, through building AR apps. It teaches you how to implement the core features of ARCore, starting from the fundamentals of 3D rendering and progressing to more advanced concepts such as lighting, shaders, and machine learning. We begin with the basics of building a project on three platforms: web, Android, and Unity. Next, we cover the ARCore concepts of motion tracking, environmental understanding, and light estimation. For each core concept, you'll work on a practical project that uses and extends the corresponding ARCore feature. You'll write custom shaders to light virtual objects in AR, build a neural network to recognize the environment, and explore grander applications by using ARCore in mixed reality. By the end of the book, you'll know how to implement motion tracking and environment learning, create animations and sounds, and generate and simulate virtual characters on screen.

Estimating light direction


With ARCore, Google provides a robust solution for estimating the amount of light in an AR scene. As we learned, though, light direction is an equally important part of scene lighting. Google didn't intentionally ignore light-direction estimation in ARCore; that problem is simply very difficult to solve well. However, ARCore does give us just enough tools to estimate light direction ourselves, given some simple assumptions. First, we assume that our user, for now anyway, will remain in the same room or area. Second, the user will need to look across at least a 180-degree arc of their surroundings; more simply put, the user just needs to look around. Third, the technique works best if the real-world environment is lit by a single bright, distant source, such as the sun. Based on those assumptions, we can simply store the direction in which the user saw the brightest image and use it to reverse-calculate our light direction. This may sound...
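The idea above can be sketched in plain Java. This is a minimal, hypothetical helper (not part of the ARCore API): each frame, you would feed it the scene's light intensity, which in ARCore comes from `Frame.getLightEstimate().getPixelIntensity()`, together with the camera's current forward vector derived from the camera pose. It remembers the viewing direction with the highest intensity seen so far and reports that as the assumed direction toward the light:

```java
// Hypothetical sketch of the brightest-direction heuristic described above.
// Assumes a single distant light source and a user who looks around the room.
public class LightDirectionTracker {
    private float maxIntensity = -1f;       // brightest intensity seen so far
    private float[] brightestViewDir = null; // camera forward at that moment

    // Call once per frame. In an ARCore app, 'intensity' would come from
    // Frame.getLightEstimate().getPixelIntensity(), and 'cameraForward'
    // from the camera pose's viewing axis.
    public void update(float intensity, float[] cameraForward) {
        if (intensity > maxIntensity) {
            maxIntensity = intensity;
            brightestViewDir = cameraForward.clone();
        }
    }

    // The frame was brightest when the user faced the light, so the stored
    // forward vector approximates the direction toward the light source.
    // Returns null until at least one frame has been observed.
    public float[] estimatedLightDirection() {
        return brightestViewDir;
    }
}
```

In practice you would reset the tracker when the user moves to a new area (our first assumption), and you might smooth the result over several frames rather than trusting a single brightest sample.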