Building gradient agreement algorithm with MAML
In the last section, we saw how the gradient agreement algorithm works: it assigns a weight to each task's gradient that reflects how strongly that gradient agrees with the gradients of the other tasks. Now, we'll see how to combine the gradient agreement algorithm with MAML by coding it from scratch using NumPy. For better understanding, we'll consider a simple binary classification task: we'll randomly generate our input data, train a simple single-layer neural network on it, and try to find the optimal parameter θ.
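Before walking through the full MAML loop, the weighting step described above can be sketched on its own. The following is a minimal sketch, not the book's exact code: the function name `gradient_agreement_weights` and the toy gradient vectors are assumptions introduced here for illustration, and the weights follow the gradient agreement formula, where each task's weight is its summed inner product with every task gradient, normalized by the sum of the absolute values of those scores:

```python
import numpy as np

def gradient_agreement_weights(gradients):
    # gradients: list of per-task gradient vectors (hypothetical helper,
    # introduced here only to illustrate the weighting step)
    G = np.stack([g.ravel() for g in gradients])  # shape: (num_tasks, num_params)
    inner = G @ G.T                               # pairwise inner products g_i . g_j
    scores = inner.sum(axis=1)                    # sum_k g_i . g_k for each task i
    # normalize so the absolute weights sum to 1
    return scores / np.abs(scores).sum()

# toy example: two tasks whose gradients agree, one that conflicts
g1 = np.array([1.0, 1.0])
g2 = np.array([0.9, 1.1])
g3 = np.array([-1.0, -1.0])
w = gradient_agreement_weights([g1, g2, g3])
```

In this toy example, the first two tasks receive positive weights while the conflicting third task receives a negative weight, so its gradient is pushed against rather than followed.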
Now, we'll see exactly how to do this, step by step.
You can also check out the whole code, available as a Jupyter Notebook here: https://github.com/sudharsan13296/Hands-On-Meta-Learning-With-Python/blob/master/08.%20Gradient%20Agreement%20As%20An%20Optimization%20Objective/8.4%20Building%20Gradient%20Agreement%20Algorithm%20with%20MAML.ipynb.
We import all of the necessary libraries:
import numpy as np
Generating data points
Now, we define a function called sample_points for generating our input (x, y) data points:
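A minimal sketch of such a function, assuming 50-dimensional inputs drawn uniformly from [0, 1) and uniformly random binary labels (the dimensionality and label distribution are assumptions here, since we are only generating synthetic data for a binary classification task):

```python
import numpy as np

def sample_points(k):
    # k random 50-dimensional input vectors in [0, 1)
    x = np.random.rand(k, 50)
    # k random binary labels, each class equally likely, as a column vector
    y = np.random.choice([0, 1], size=k, p=[0.5, 0.5]).reshape([-1, 1])
    return x, y

x, y = sample_points(10)
```

Calling `sample_points(10)` returns a `(10, 50)` input array and a `(10, 1)` array of 0/1 labels.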