Hands-On Graph Neural Networks Using Python

By: Maxime Labonne

Overview of this book

Graph neural networks are a highly effective tool for analyzing data that can be represented as a graph, such as chemical compounds or transportation networks. The past few years have seen an explosion in the use of graph neural networks, with applications ranging from natural language processing and computer vision to recommendation systems and drug discovery.

Hands-On Graph Neural Networks Using Python begins with the fundamentals of graph theory and shows you how to create graph datasets from tabular data. As you advance, you'll explore major graph neural network architectures and learn essential concepts such as graph convolution, self-attention, link prediction, and heterogeneous graphs. Finally, the book proposes applications to solve real-life problems, enabling you to build a professional portfolio. The code is readily available online and can be easily adapted to other datasets and apps.

By the end of this book, you'll be able to create graph datasets, implement graph neural networks using Python and PyTorch Geometric, and build and train models for node and graph classification, link prediction, and much more.
Table of Contents (25 chapters)

Part 1: Introduction to Graph Learning
Part 2: Fundamentals
Part 3: Advanced Techniques
Part 4: Applications
Chapter 18: Unlocking the Potential of Graph Neural Networks for Real-World Applications

Summary

In this chapter, we introduced an essential new architecture: the GAT. We walked through its inner workings in four main steps, from linear transformation to multi-head attention, and saw how it works in practice by implementing a graph attention layer in NumPy. Finally, we applied a GAT model (with GATv2) to the Cora and CiteSeer datasets, where it achieved excellent accuracy scores. We showed that these scores depend on the number of neighbors, which is a first step toward error analysis.
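The four steps summarized above can be sketched in plain NumPy. This is a minimal single-head illustration, not the book's exact code: the toy graph, feature dimensions, and weight initializations are all illustrative assumptions.

```python
import numpy as np

np.random.seed(0)

# Toy graph: 4 nodes, adjacency matrix with self-loops (connectivity is illustrative)
A = np.array([[1, 1, 1, 1],
              [1, 1, 0, 0],
              [1, 0, 1, 1],
              [1, 0, 1, 1]])
X = np.random.uniform(-1, 1, (4, 4))      # node features (4 nodes, 4 features each)

# Step 1: linear transformation of node features
W = np.random.uniform(-1, 1, (2, 4))      # weight matrix, projects features to dim 2
W_att = np.random.uniform(-1, 1, (1, 4))  # attention vector over concatenated pairs
H_lin = X @ W.T                           # transformed features, shape (4, 2)

# Step 2: raw attention scores for every connected pair (i, j)
rows, cols = np.where(A > 0)
pair_feats = np.concatenate([H_lin[rows], H_lin[cols]], axis=1)  # (n_edges, 4)
a = (W_att @ pair_feats.T)[0]                                    # (n_edges,)

# Step 3: LeakyReLU, then softmax so each node's attention over neighbors sums to 1
def leaky_relu(x, alpha=0.2):
    return np.maximum(alpha * x, x)

E = np.zeros(A.shape)
E[rows, cols] = leaky_relu(a)

attention = np.zeros_like(E)
for i in range(A.shape[0]):
    nbrs = A[i] > 0
    ex = np.exp(E[i, nbrs])
    attention[i, nbrs] = ex / ex.sum()

# Step 4: weighted aggregation of transformed neighbor features (one head)
H = attention @ H_lin
print(H.shape)  # (4, 2)
```

A multi-head layer would repeat steps 1 to 4 with independent `W` and `W_att` matrices and concatenate (or average) the resulting `H` matrices, which is what the chapter's final step describes.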

In Chapter 8, Scaling Graph Neural Networks with GraphSAGE, we will introduce a new architecture dedicated to managing large graphs. To test this claim, we will implement it on a new dataset several times larger than those we've seen so far. We will also discuss transductive and inductive learning, an important distinction for GNN practitioners.