The Deep Learning with PyTorch Workshop

By: Hyatt Saleh
Overview of this book

Want to get to grips with one of the most popular machine learning libraries for deep learning? The Deep Learning with PyTorch Workshop will help you do just that, jumpstarting your knowledge of using PyTorch for deep learning even if you're starting from scratch. It's no surprise that deep learning's popularity has risen steeply in the past few years, thanks to intelligent applications such as self-driving vehicles, chatbots, and voice-activated assistants that are making our lives easier. This book will take you inside the world of deep learning, where you'll use PyTorch to understand the complexity of neural network architectures.

The Deep Learning with PyTorch Workshop starts with an introduction to deep learning and its applications. You'll explore the syntax of PyTorch and learn how to define a network architecture and train a model. Next, you'll learn about three main neural network architectures - convolutional, artificial, and recurrent - and solve real-world data problems using these networks. Later chapters will show you how to create a style transfer model that develops a new image from two images, before finally taking you through how RNNs store memory to solve key data issues.

By the end of this book, you'll have mastered the essential concepts, tools, and libraries of PyTorch to develop your own deep neural networks and intelligent apps.
Table of Contents (8 chapters)

Batch Normalization

It is typical to normalize the input layer in an attempt to speed up learning, as well as to improve performance by rescaling all the features to the same scale. So the question arises: if the model benefits from normalizing the input layer, why not also normalize the outputs of all the hidden layers in an attempt to improve the training speed even more?
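To make the idea of input normalization concrete, here is a minimal sketch (the tensor values are made up for illustration) of rescaling each input feature to zero mean and unit variance, so that features measured on very different scales contribute comparably:

```python
import torch

# Hypothetical data: two features on wildly different scales.
x = torch.tensor([[10.0, 0.001],
                  [20.0, 0.002],
                  [30.0, 0.003]])

mean = x.mean(dim=0, keepdim=True)  # per-feature mean
std = x.std(dim=0, keepdim=True)    # per-feature standard deviation
x_norm = (x - mean) / std           # both columns now share the same scale
```

After this transformation, every column of `x_norm` has mean 0 and standard deviation 1, regardless of the original units.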

Batch normalization, as its name suggests, normalizes the outputs of the hidden layers, reducing the shift in the distribution of each layer's activations, which is also known as internal covariate shift. Reducing this shift is useful because it also helps the model work well on images that follow a different distribution than the images used to train it.
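In PyTorch, batch normalization is applied by inserting a normalization layer after a hidden layer. The architecture below is a minimal sketch (the layer sizes are assumptions, not taken from the book) showing where `nn.BatchNorm1d` sits in a fully connected network:

```python
import torch
import torch.nn as nn

# Sketch of a small fully connected network with batch normalization
# applied to the 25 hidden activations (layer sizes are illustrative).
model = nn.Sequential(
    nn.Linear(10, 25),
    nn.BatchNorm1d(25),  # normalizes each hidden unit across the batch
    nn.ReLU(),
    nn.Linear(25, 1),
)

batch = torch.randn(32, 10)  # a batch of 32 examples with 10 features each
out = model(batch)           # shape: (32, 1)
```

Note that `nn.BatchNorm1d` uses the statistics of the current batch during training and stored running statistics during evaluation (after calling `model.eval()`), so its behavior differs between the two modes.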

Take, for instance, a network whose purpose is to detect whether an animal is a cat. When the network is trained using only images of black cats, batch normalization can help it also classify new images of cats of different colors by normalizing...
