Hands-On Graph Neural Networks Using Python

By: Maxime Labonne

Overview of this book

Graph neural networks are a highly effective tool for analyzing data that can be represented as a graph, such as social networks, chemical compounds, or transportation networks. The past few years have seen an explosion in their use, with applications ranging from natural language processing and computer vision to recommendation systems and drug discovery.

Hands-On Graph Neural Networks Using Python begins with the fundamentals of graph theory and shows you how to create graph datasets from tabular data. As you advance, you'll explore major graph neural network architectures and learn essential concepts such as graph convolution, self-attention, link prediction, and heterogeneous graphs. Finally, the book proposes applications to solve real-life problems, enabling you to build a professional portfolio. The code is readily available online and can be easily adapted to other datasets and apps.

By the end of this book, you'll have learned to create graph datasets, implement graph neural networks using Python and PyTorch Geometric, and apply them to real-world problems, including building and training models for node classification, graph classification, link prediction, and much more.
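As a concrete taste of that workflow, here is a minimal sketch (not code from the book) of turning tabular rows into a graph dataset using PyTorch Geometric's Data object; the features, edges, and labels below are made-up placeholders.

    import torch
    from torch_geometric.data import Data

    # Hypothetical tabular data: one feature row and one label per node
    x = torch.tensor([[0.2, 1.0], [0.7, 0.3], [0.5, 0.8]], dtype=torch.float)  # node features
    y = torch.tensor([0, 1, 0])                                                # node labels

    # Edges as a [2, num_edges] index tensor (row 0: source nodes, row 1: target nodes)
    edge_index = torch.tensor([[0, 1, 1, 2],
                               [1, 0, 2, 1]], dtype=torch.long)

    # A graph object that PyTorch Geometric layers can consume directly
    data = Data(x=x, edge_index=edge_index, y=y)
    print(data)  # Data(x=[3, 2], edge_index=[2, 4], y=[3])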
Table of Contents (25 chapters)

Part 1: Introduction to Graph Learning
Part 2: Fundamentals
Part 3: Advanced Techniques
Part 4: Applications
Chapter 18: Unlocking the Potential of Graph Neural Networks for Real-World Applications

Summary

In this chapter, we learned about the missing link between vanilla neural networks and GNNs. We built our own GNN architecture using our intuition and a bit of linear algebra. We explored two popular graph datasets from the scientific literature to compare our two architectures. Finally, we implemented them in PyTorch and evaluated their performance. The result is clear: even our intuitive version of a GNN completely outperforms the MLP on both datasets.
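As a reminder of what that intuitive architecture looks like, here is a minimal sketch of a vanilla graph linear layer, assuming the usual formulation H' = ÂXW with Â = A + I; the class name, dimensions, and toy graph are illustrative rather than the book's exact code.

    import torch
    from torch import nn

    class VanillaGNNLayer(nn.Module):
        """One graph linear layer: H' = A_hat @ X @ W, with A_hat = A + I (self-loops)."""
        def __init__(self, dim_in, dim_out):
            super().__init__()
            self.linear = nn.Linear(dim_in, dim_out, bias=False)

        def forward(self, x, adjacency):
            x = self.linear(x)       # X @ W
            return adjacency @ x     # A_hat @ (X @ W): aggregate each node's neighbors and itself

    # Toy usage: 3 nodes, 4 input features, 2 output features
    adjacency = torch.tensor([[1., 1., 0.],
                              [1., 1., 1.],
                              [0., 1., 1.]])   # A + I for a 3-node path graph
    x = torch.rand(3, 4)
    out = VanillaGNNLayer(4, 2)(x, adjacency)  # shape: [3, 2]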

In Chapter 6, Normalizing Embeddings with Graph Convolutional Networks, we will refine our vanilla GNN architecture to correctly normalize its inputs. This graph convolutional network model is an incredibly efficient baseline that we'll keep using throughout the rest of the book. We will compare its results on our two previous datasets and introduce an interesting new task: node regression.
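For reference, the normalization introduced in that chapter corresponds to the standard GCN propagation rule; the sketch below assumes the symmetric form D^{-1/2}(A + I)D^{-1/2} from Kipf and Welling and is not the book's exact implementation.

    import torch

    def gcn_normalize(adjacency: torch.Tensor) -> torch.Tensor:
        """Symmetrically normalize A + I so that high-degree nodes do not dominate the sums."""
        a_hat = adjacency + torch.eye(adjacency.size(0))   # add self-loops
        deg = a_hat.sum(dim=1)                             # degrees of A + I
        d_inv_sqrt = torch.diag(deg.pow(-0.5))             # D^{-1/2}
        return d_inv_sqrt @ a_hat @ d_inv_sqrt             # D^{-1/2} (A + I) D^{-1/2}

    # The normalized matrix can replace the raw adjacency in the vanilla layer above
    adjacency = torch.tensor([[0., 1., 0.],
                              [1., 0., 1.],
                              [0., 1., 0.]])
    print(gcn_normalize(adjacency))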