Learn Unity ML-Agents – Fundamentals of Unity Machine Learning

Overview of this book

Unity Machine Learning Agents lets researchers and developers create games and simulations in the Unity Editor, which serves as an environment where intelligent agents can be trained with machine learning methods through an easy-to-use Python API. This book takes you from the basics of Reinforcement Learning and Q-Learning to building Deep Recurrent Q-Network agents that cooperate or compete in a multi-agent ecosystem. You will start with the fundamentals of Reinforcement Learning and how to apply it to problems. Then you will learn how to build self-learning, advanced neural networks with Python and Keras/TensorFlow. From there, you will move on to more advanced training scenarios, where you will learn further innovative ways to train your network with A3C, imitation, and curriculum learning models. By the end of the book, you will have learned how to build more complex environments, culminating in a cooperative and competitive multi-agent ecosystem.

Summary

This has been an exciting chapter, in which we explored several variations of training scenarios. We started by extending our training to multi-agent environments that still used a single brain. Next, we looked at a variation of multi-agent training called adversarial self-play, which allows us to train pairs of agents using a system of inverse rewards. Then, we covered how an agent can be configured to make decisions at a specific frequency, or even on demand. After that, we looked at another novel method of training called Imitation Learning. This training scenario allowed us to play and, at the same time, teach an agent to play tennis. Finally, we completed the chapter with another training technique called Curriculum Learning, which allowed us to gradually increase the complexity of an agent's training over time.
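To make the curriculum idea concrete, here is a minimal, self-contained Python sketch of the scheduling logic: training begins on an easy version of the task, and a difficulty parameter is raised once the agent's recent mean reward crosses a threshold. The lesson values, thresholds, and the stand-in reward calculation are illustrative assumptions, not the actual curriculum settings used in the chapter.

    from collections import deque

    # Hypothetical curriculum: each lesson raises the difficulty parameter
    # (for example, the height of a wall the agent must learn to clear).
    LESSONS = [0.0, 1.5, 3.0, 4.5]          # assumed difficulty value per lesson
    REWARD_THRESHOLDS = [0.2, 0.5, 0.8]     # mean reward required to advance past each lesson

    def next_lesson(mean_reward, lesson):
        """Advance to the next lesson once the running mean reward passes its threshold."""
        if lesson < len(REWARD_THRESHOLDS) and mean_reward >= REWARD_THRESHOLDS[lesson]:
            return lesson + 1
        return lesson

    # Toy training loop with a stand-in for the real environment and agent.
    recent_rewards = deque(maxlen=100)
    lesson = 0
    for episode in range(1000):
        difficulty = LESSONS[lesson]
        # Placeholder for an actual rollout: easier lessons yield higher reward here.
        episode_reward = 1.0 - difficulty / 10.0
        recent_rewards.append(episode_reward)
        mean_reward = sum(recent_rewards) / len(recent_rewards)
        lesson = next_lesson(mean_reward, lesson)

    print(f"Finished on lesson {lesson} with mean reward {mean_reward:.2f}")

In the actual ML-Agents toolkit, the same progression is typically expressed declaratively in a curriculum configuration file consumed by the trainer, rather than hand-coded in the training loop as shown here.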
