The Deep Learning Workshop

By: Mirza Rahim Baig, Thomas Joseph, Nipun Sadvilkar, Mohan Kumar Silaparasetty, Anthony So, Akshay Chauhan, Nagendra Nagaraj, Robert Ridley

Overview of this book

Are you fascinated by how deep learning powers intelligent applications such as self-driving cars, virtual assistants, facial recognition devices, and chatbots to process data and solve complex problems? Whether you are familiar with machine learning or are new to this domain, The Deep Learning Workshop will make it easy for you to understand deep learning with the help of interesting examples and exercises throughout. The book starts by highlighting the relationship between deep learning, machine learning, and artificial intelligence and helps you get comfortable with the TensorFlow 2.0 programming structure using hands-on exercises. You’ll understand neural networks, the structure of a perceptron, and how to use TensorFlow to create and train models. The book will then let you explore the fundamentals of computer vision by performing image recognition exercises with convolutional neural networks (CNNs) using Keras. As you advance, you’ll be able to make your model more powerful by implementing text embedding and sequencing the data using popular deep learning solutions. Finally, you’ll get to grips with bidirectional recurrent neural networks (RNNs) and build generative adversarial networks (GANs) for image synthesis. By the end of this deep learning book, you’ll have learned the skills essential for building deep learning models with TensorFlow and Keras.

Transfer Learning

So far, we've learned a lot about designing and training our own CNN models. But as you may have noticed, some of our models are not performing very well. This can be due to multiple reasons, such as the dataset being too small or our model requiring more training.

But training a CNN takes a lot of time. It would be great if we could reuse an existing architecture that has already been trained. Luckily for us, such an option does exist, and it is called transfer learning. TensorFlow provides different implementations of state-of-the-art models that have been trained on the ImageNet dataset (over 14 million images).

Note

You can find the list of available pretrained models in the TensorFlow documentation: https://www.tensorflow.org/api_docs/python/tf/keras/applications

To use a pretrained model, we need to import the class that implements it. Here, we will import the VGG16 model:

import tensorflow as tf
from tensorflow.keras.applications import VGG16
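
The excerpt cuts off here, but as a rough sketch of how a pretrained model is typically reused (the input shape, number of classes, and classification head below are illustrative placeholders, not the book's actual values), you might load the convolutional base without its original classifier, freeze it, and add a new head on top:

import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# Load VGG16 with its ImageNet weights, without the original classifier head.
# The input shape here is only an illustrative choice.
base_model = VGG16(weights='imagenet', include_top=False,
                   input_shape=(224, 224, 3))

# Freeze the convolutional base so its pretrained weights stay unchanged
base_model.trainable = False

# Add a small classification head; num_classes is a placeholder value
num_classes = 10
model = models.Sequential([
    base_model,
    layers.Flatten(),
    layers.Dense(256, activation='relu'),
    layers.Dense(num_classes, activation='softmax')
])

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()

Because only the small new head is trained while the pretrained base stays frozen, training is far faster than fitting the whole network from scratch.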