
Deep Learning with PyTorch Lightning

By: Kunal Sawarkar
3.5 (2)

Overview of this book

Building and implementing deep learning (DL) models is becoming a key skill for those who want to be at the forefront of progress. But with so much information and so many complex study materials out there, getting started with DL can feel overwhelming. Written by an AI thought leader, Deep Learning with PyTorch Lightning helps researchers build their first DL models quickly and easily without getting stuck on the complexities. With its help, you'll be able to maximize productivity on DL projects while retaining full flexibility, from model formulation to implementation. Throughout this book, you'll learn how to configure PyTorch Lightning on a cloud platform, understand its architectural components, and explore how they are configured to build various industry solutions. You'll build a neural network architecture, deploy an application from scratch, and see how you can expand it based on your specific needs, beyond what the framework provides. In the later chapters, you'll also learn how to build and train a variety of models with PyTorch Lightning, including convolutional neural networks (CNNs), natural language processing (NLP) models, time series models, self-supervised and semi-supervised learning, and generative adversarial networks (GANs). By the end of this book, you'll be able to build and deploy DL models with confidence.
Table of Contents (15 chapters)
Section 1: Kickstarting with PyTorch Lightning
Section 2: Solving using PyTorch Lightning
Section 3: Advanced Topics

Summary

Transfer learning is one of the most common ways to cut compute costs, save time, and get good results. In this chapter, we learned how to build models with the ResNet-50 and pre-trained BERT architectures using PyTorch Lightning.

We built an image classifier and a text classifier, and along the way covered some useful PyTorch Lightning life cycle methods. We learned how to apply pre-trained models to our customized datasets with less effort and fewer training epochs. Even with very little model tuning, we were able to achieve decent accuracy.

While transfer learning methods work well, their limitations should also be borne in mind. They are particularly effective for language models because the text in your dataset is usually made up of the same English words as the core training set. When the core training set is very different from your dataset, however, performance suffers. For example, if you want to build an image...