The Deep Learning with Keras Workshop

By Matthew Moocarme, Mahla Abdolahnejad, Ritesh Bhagwat
Overview of this book

New experiences can be intimidating, but not this one! This beginner's guide to deep learning is here to help you explore deep learning from scratch with Keras, and be on your way to training your first ever neural networks. What sets Keras apart from other deep learning frameworks is its simplicity. With over two hundred thousand users, Keras has stronger adoption in industry and the research community than any other deep learning framework.

The Deep Learning with Keras Workshop starts by introducing you to the fundamental concepts of machine learning using the scikit-learn package. After learning how to perform the linear transformations that are necessary for building neural networks, you'll build your first neural network with the Keras library. As you advance, you'll learn how to build multi-layer neural networks and recognize when your model is underfitting or overfitting to the training data.

With the help of practical exercises, you'll learn to use cross-validation techniques to evaluate your models and then choose the optimal hyperparameters to fine-tune their performance. Finally, you'll explore recurrent neural networks and learn how to train them to predict values in sequential data. By the end of this book, you'll have developed the skills you need to confidently train your own neural network models.

Summary

In this chapter, you extended your knowledge of deep learning, from understanding the common representations and terminology to implementing them in practice through exercises and activities. You learned how forward propagation in neural networks works and how it is used to predict outputs, how the loss function serves as a measure of model performance, and how backpropagation is used to compute the derivatives of the loss function with respect to the model parameters.
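
As a brief illustration of these ideas, the following minimal sketch uses plain NumPy rather than Keras, with an assumed single linear layer and mean squared error loss (not the book's exact setup), to show a forward pass, a loss computation, and the gradients that backpropagation produces:

```python
import numpy as np

# Toy data: 4 samples with 3 features each and one target value per sample
X = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6],
              [0.7, 0.8, 0.9],
              [1.0, 1.1, 1.2]])
y = np.array([[0.5], [1.0], [1.5], [2.0]])

# Parameters of a single linear layer: weights and bias
W = np.random.randn(3, 1) * 0.1
b = np.zeros((1,))

# Forward propagation: compute predictions from the inputs and parameters
y_pred = X @ W + b

# Loss function: mean squared error between predictions and targets
loss = np.mean((y_pred - y) ** 2)

# Backpropagation: derivatives of the loss with respect to W and b
grad_y_pred = 2 * (y_pred - y) / len(y)
grad_W = X.T @ grad_y_pred
grad_b = grad_y_pred.sum(axis=0)
```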

You also learned about gradient descent, which uses the gradients computed by backpropagation to gradually update the model parameters. In addition to the basic theory and concepts, you implemented and trained both shallow and deep neural networks with Keras and used them to make predictions for given inputs.
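
As an example of how this might look in practice, the sketch below builds a small deep network in Keras, compiles it with stochastic gradient descent, trains it, and uses it to make predictions. The two-hidden-layer architecture and the randomly generated data are assumptions for illustration, not the book's exact examples:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Randomly generated stand-in data: 100 samples with 8 features and a binary target
X = np.random.rand(100, 8)
y = np.random.randint(0, 2, size=(100, 1))

# A small deep network: two hidden layers followed by a sigmoid output
model = Sequential([
    Dense(16, activation='relu', input_shape=(8,)),
    Dense(8, activation='relu'),
    Dense(1, activation='sigmoid'),
])

# Gradient descent (SGD) updates the parameters using the backpropagated gradients
model.compile(optimizer='sgd', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=10, batch_size=16, verbose=0)

# Use the trained model to predict outputs for given inputs
predictions = model.predict(X[:5])
```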

To evaluate your models appropriately, you split a dataset into a training set and a test set as an alternative approach for improving network evaluation, and learned the reasons...
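
A brief sketch of that splitting approach, assuming scikit-learn's train_test_split, a small Keras model, and randomly generated stand-in data (none of which are taken from the book's own examples), might look like this:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Randomly generated stand-in data: 200 samples with 8 features and a binary target
X = np.random.rand(200, 8)
y = np.random.randint(0, 2, size=(200, 1))

# Hold out 20% of the data as a test set the model never sees during training
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# A simple network, trained only on the training set
model = Sequential([
    Dense(16, activation='relu', input_shape=(8,)),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='sgd', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=10, batch_size=16, verbose=0)

# Evaluate on the unseen test set to estimate generalization performance
test_loss, test_accuracy = model.evaluate(X_test, y_test, verbose=0)
```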