Hands-On Meta Learning with Python

By: Sudharsan Ravichandiran

Overview of this book

Meta learning is an exciting research trend in machine learning that enables a model to understand the learning process itself. Unlike other ML paradigms, meta learning lets you learn from small datasets, and learn faster. Hands-On Meta Learning with Python starts by explaining the fundamentals of meta learning and helps you understand the concept of learning to learn. You will delve into various one-shot learning algorithms, such as siamese, prototypical, relation, and memory-augmented networks, by implementing them in TensorFlow and Keras. As you make your way through the book, you will dive into state-of-the-art meta learning algorithms such as MAML, Reptile, and CAML. You will then explore how to learn quickly with Meta-SGD and discover how you can perform unsupervised learning using meta learning with CACTUs. In the concluding chapters, you will work through recent trends in meta learning such as adversarial meta learning, task-agnostic meta learning, and meta imitation learning. By the end of this book, you will be familiar with state-of-the-art meta learning algorithms and able to bring human-like cognition to your machine learning models.

Chapter 1: Introduction to Meta Learning


  1. Meta learning produces a versatile AI model that can learn to perform various tasks without having to be trained from scratch. We train the meta learning model on many related tasks, each with only a few data points, so that for a new but related task it can reuse what it learned from the previous tasks instead of starting over from scratch.
  2. Learning from a few data points is called few-shot learning or k-shot learning, where k denotes the number of data points in each class of the dataset.
  3. In order to make our model learn from a few data points, we train it in exactly that few-shot regime. So, given a dataset D, we sample a few data points from each of the classes present in the dataset and call this collection the support set (see the first sketch after this list).
  4. We then sample different data points from each of those classes, disjoint from the support set, and call them the query set.
  5. In a metric-based meta learning setting, we learn an appropriate metric space. Say we want to find the similarity between two images: in a metric-based setting, we use a simple neural network that extracts features from the two images and measures their similarity by computing the distance between those two feature vectors (see the second sketch after this list).
  6. We train our model in an episodic fashion; that is, in each episode we sample a few data points from our dataset D, prepare a support set, and learn on that support set. Over a series of episodes, our model learns how to learn from a smaller dataset.
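
The following is a minimal sketch of how one episode of n-way k-shot data could be sampled into a support set and a query set. The dataset layout (a dict mapping each class label to an array of examples) and the name sample_episode are assumptions made for illustration; they are not code from the book.

```python
import numpy as np

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=5, rng=None):
    """Sample one episode (a support set and a query set) from `dataset`,
    which is assumed to be a dict mapping class label -> array of examples."""
    rng = rng or np.random.default_rng()

    # Pick n_way classes at random for this episode.
    classes = rng.choice(list(dataset.keys()), size=n_way, replace=False)

    support_x, support_y, query_x, query_y = [], [], [], []
    for label, cls in enumerate(classes):
        examples = dataset[cls]
        # Draw k_shot + q_queries distinct examples so the query set
        # never overlaps the support set.
        idx = rng.choice(len(examples), size=k_shot + q_queries, replace=False)
        support_x.append(examples[idx[:k_shot]])
        support_y.extend([label] * k_shot)
        query_x.append(examples[idx[k_shot:]])
        query_y.extend([label] * q_queries)

    return (np.concatenate(support_x), np.array(support_y),
            np.concatenate(query_x), np.array(query_y))
```

In an episodic training loop, you would call a function like this once per episode, adapt the model on the support set, and evaluate it on the query set.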
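
And here is a minimal sketch of the metric-based idea from point 5: one shared embedding network processes both images, and similarity is derived from the distance between the two feature vectors. The architecture, the names embedding_net and similarity, and the exp(-distance) score are illustrative assumptions, not the book's exact implementation.

```python
import tensorflow as tf

# A small shared embedding network (the architecture here is an
# illustrative assumption, not the book's exact model).
embedding_net = tf.keras.Sequential([
    tf.keras.layers.Conv2D(64, 3, activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPool2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64),
])

def similarity(image_a, image_b):
    """Embed both images with the same network and return a similarity
    score based on the Euclidean distance between their features."""
    feat_a = embedding_net(image_a)          # shape: (batch, 64)
    feat_b = embedding_net(image_b)
    distance = tf.norm(feat_a - feat_b, axis=-1)
    # Smaller distance means more similar; map to (0, 1] for convenience.
    return tf.exp(-distance)

# Example call with two random tensors, just to show the expected shapes.
a = tf.random.normal((1, 28, 28, 1))
b = tf.random.normal((1, 28, 28, 1))
print(similarity(a, b).numpy())
```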