Deep Learning with PyTorch

By: Vishnu Subramanian
Overview of this book

Deep learning powers the most intelligent systems in the world, such as Google Voice, Siri, and Alexa. Advances in powerful hardware such as GPUs, software frameworks such as PyTorch, Keras, TensorFlow, and CNTK, and the availability of big data have made it easier to implement solutions to problems in text, vision, and advanced analytics. This book will get you up and running with one of the most cutting-edge deep learning libraries: PyTorch. PyTorch is grabbing the attention of deep learning researchers and data science professionals due to its accessibility, efficiency, and more Pythonic approach to development. You'll start by installing PyTorch, then quickly move on to the fundamental building blocks that power modern deep learning. You will also learn how to use CNNs, RNNs, LSTMs, and other networks to solve real-world problems. This book explains the concepts behind various state-of-the-art deep learning architectures, such as ResNet, DenseNet, Inception, and Seq2Seq, without diving deep into the math behind them. You will also learn about GPU computing over the course of the book. You will see how to train a model with PyTorch and dive into complex neural networks, such as generative networks for producing text and images. By the end of the book, you'll be able to implement deep learning applications in PyTorch with ease.

Language modeling

We will learn how to teach a recurrent neural network (RNN) to generate a sequence of text. In simple words, the RNN model we will build will be able to predict the next word, given some context. This is just like the Swift app on your phone, which guesses the next word as you type. The ability to generate sequential data has applications in many areas, such as:

  • Image captioning
  • Speech recognition
  • Language translation
  • Automatic email reply
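The next-word prediction described above boils down to a simple pipeline: embed word indices, run them through a recurrent layer, and project each hidden state to a score for every word in the vocabulary. A minimal sketch in PyTorch (the class name and layer sizes here are illustrative, not the book's code):

```python
import torch
import torch.nn as nn

class NextWordModel(nn.Module):
    """Embed tokens, run an LSTM, and project hidden states
    to vocabulary logits (one score per candidate next word)."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word indices
        out, _ = self.lstm(self.embed(tokens))
        return self.fc(out)  # (batch, seq_len, vocab_size) logits

model = NextWordModel(vocab_size=1000)
logits = model(torch.randint(0, 1000, (4, 10)))
print(logits.shape)  # torch.Size([4, 10, 1000])
```

At each position, the argmax over the last dimension is the model's guess for the next word; training minimizes cross-entropy between these logits and the actual next words.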

We learnt in Chapter 6, Deep Learning with Sequence Data and Text, that RNNs are tough to train, so we will use a variant of the RNN called Long Short-Term Memory (LSTM). Development of the LSTM algorithm started in 1997, but it has become popular only in the last few years, due to the availability of powerful hardware and quality data, and some advancements such as dropout also...
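The dropout just mentioned can be applied directly when stacking LSTM layers in PyTorch: `nn.LSTM` accepts a `dropout` argument that is applied to the outputs of each layer except the last. A small sketch with illustrative sizes:

```python
import torch
import torch.nn as nn

# Two stacked LSTM layers; dropout=0.5 is applied to the output of
# the first layer before it feeds the second (sizes are illustrative).
lstm = nn.LSTM(input_size=128, hidden_size=256,
               num_layers=2, dropout=0.5, batch_first=True)

x = torch.randn(4, 10, 128)     # (batch, seq_len, features)
output, (h_n, c_n) = lstm(x)
print(output.shape, h_n.shape)  # torch.Size([4, 10, 256]) torch.Size([2, 4, 256])
```

`output` holds the top layer's hidden state at every time step, while `h_n` and `c_n` hold the final hidden and cell states for each of the two layers.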