The Python Workshop - Second Edition

By: Corey Wade, Mario Corchero Jiménez, Andrew Bird, Dr. Lau Cher Han, Graham Lee

Overview of this book

Python is among the most popular programming languages in the world. It’s ideal for beginners because it’s easy to read and write, and for experienced developers because it’s widely available, with a strong support community, extensive documentation, and phenomenal libraries – both built-in and user-contributed. This project-based course has been designed by a team of expert authors to get you up and running with Python. You’ll work through engaging projects that’ll enable you to leverage your newfound Python skills efficiently in technical jobs, personal projects, and job interviews. The book will help you gain an edge in data science, web development, and software development, preparing you to tackle real-world challenges in Python and pursue advanced topics on your own. Throughout the chapters, each component has been explicitly designed to engage and stimulate different parts of the brain so that you can retain and apply what you learn in a practical context with maximum impact. By completing the course from start to finish, you’ll walk away feeling capable of tackling any real-world Python development problem.
Table of Contents (16 chapters)
Chapter 13: The Evolution of Python – Discovering New Python Features

Additional regularization technique – Dropout

Regularization is built into the Early Stopping monitor because a validation set is scored at the end of each epoch, alongside the training set. The idea is that even if the training score keeps improving, the model stops training once the validation score has failed to improve for the number of epochs set by the callback’s patience.
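
As a sketch of how this might look in practice – assuming a Keras-style workflow with the EarlyStopping callback, and using placeholder data and layer sizes that are not taken from the book – the monitor can be passed to fit() like this:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.callbacks import EarlyStopping

# Placeholder data for illustration only (not from the book)
X = np.random.rand(500, 20)
y = np.random.rand(500)

model = Sequential([
    Input(shape=(20,)),
    Dense(64, activation='relu'),
    Dense(1)
])
model.compile(optimizer='adam', loss='mse')

# Stop once the validation loss has failed to improve for 5 epochs in a row
early_stop = EarlyStopping(monitor='val_loss', patience=5)

model.fit(X, y, validation_split=0.2, epochs=100,
          callbacks=[early_stop], verbose=0)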

It’s important to examine additional regularization techniques so that you can build even larger neural networks without overfitting the data.

Another very popular regularization technique widely used in neural networks is called Dropout. Because multiple nodes across multiple layers result in thousands or even millions of weights, neural networks can easily overfit the training set.

The idea behind Dropout is to randomly drop some nodes, together with their connections, during each training pass. In densely connected networks, where every node in one layer is connected to every node in the next, any node may be dropped except those in the final output layer.

Dropout works in code as shown in the sketch below.
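
This is a minimal sketch assuming a Keras-style Sequential model; the dropout rate of 0.5 and the layer sizes are illustrative assumptions, not values from the book:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense, Dropout

model = Sequential([
    Input(shape=(20,)),
    Dense(64, activation='relu'),
    Dropout(0.5),   # randomly zero 50% of the previous layer's outputs during training
    Dense(32, activation='relu'),
    Dropout(0.5),
    Dense(1)        # no Dropout is applied to the output layer
])
model.compile(optimizer='adam', loss='mse')
model.summary()

At prediction time, Keras automatically disables Dropout, so the full network is used for inference.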