
Mastering PyTorch

By: Ashish Ranjan Jha

Overview of this book

Deep learning is driving the AI revolution, and PyTorch is making it easier than ever before for anyone to build deep learning applications. This PyTorch book will help you uncover expert techniques to get the most out of your data and build complex neural network models. The book starts with a quick overview of PyTorch and explores using convolutional neural network (CNN) architectures for image classification. You'll then work with recurrent neural network (RNN) architectures and transformers for sentiment analysis. As you advance, you'll apply deep learning across domains such as music, text, and image generation using generative models, and explore the world of generative adversarial networks (GANs). You'll not only build and train your own deep reinforcement learning models in PyTorch but also deploy PyTorch models to production using expert tips and techniques. Finally, you'll get to grips with training large models efficiently in a distributed manner, searching neural architectures effectively with AutoML, and rapidly prototyping models using PyTorch and fast.ai. By the end of this PyTorch book, you'll be able to perform complex deep learning tasks using PyTorch to build smart artificial intelligence models.
Table of Contents (20 chapters)

Section 1: PyTorch Overview
Section 2: Working with Advanced Neural Network Architectures
Section 3: Generative Models and Deep Reinforcement Learning
Section 4: PyTorch in Production Systems

Using Optuna for hyperparameter search

Optuna is one of the hyperparameter search tools that support PyTorch. You can read about the tool's search strategies, such as TPE (Tree-structured Parzen Estimator) and CMA-ES (Covariance Matrix Adaptation Evolution Strategy), in detail in the Optuna paper at https://arxiv.org/pdf/1907.10902.pdf. Besides these advanced hyperparameter search methodologies, the tool provides a sleek API, which we will explore in a moment.

Tool citation

Optuna: A Next-Generation Hyperparameter Optimization Framework.

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama (KDD 2019).

In this section, we will once again build and train the MNIST model, this time using Optuna to figure out the optimal hyperparameter setting. We will discuss important parts of the code step by step, in the form of an exercise. The full code can be found here:

https://github.com/PacktPublishing/Mastering-PyTorch/blob/master/Chapter12...