Deep Learning with fastai Cookbook

By: Mark Ryan

Overview of this book

fastai is an easy-to-use deep learning framework built on top of PyTorch that lets you rapidly create complete deep learning solutions with as few as 10 lines of code. The two predominant low-level deep learning frameworks, TensorFlow and PyTorch, require a lot of code even for straightforward applications. In contrast, fastai handles the messy details for you and lets you focus on applying deep learning to solve actual problems. The book begins by summarizing the value of fastai and showing you how to create a simple 'hello world' deep learning application with fastai. You'll then learn how to use fastai for all four application areas that the framework explicitly supports: tabular data, text data (NLP), recommender systems, and vision data. As you advance, you'll work through a series of practical examples that illustrate how to create real-world applications of each type. Next, you'll learn how to deploy fastai models, including creating a simple web application that predicts what object is depicted in an image. The book wraps up with an overview of the advanced features of fastai. By the end of this fastai book, you'll be able to create your own deep learning applications using fastai. You'll also have learned how to use fastai to prepare raw datasets, explore datasets, train deep learning models, and deploy trained models.
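
To give a concrete sense of that 'hello world' application, the following is a minimal sketch (not code from the book itself) of a complete fastai image classifier. The Oxford-IIIT Pets dataset, the is_cat labeling function, and the resnet34 architecture are illustrative assumptions; fastai's high-level vision API does the rest:

    from fastai.vision.all import *

    # Download the Oxford-IIIT Pets dataset bundled with fastai
    path = untar_data(URLs.PETS)/'images'

    # In this dataset, cat breeds have filenames starting with an uppercase letter
    def is_cat(fname):
        return fname[0].isupper()

    # Build DataLoaders from the image files, holding out 20% for validation
    dls = ImageDataLoaders.from_name_func(
        path, get_image_files(path), valid_pct=0.2, seed=42,
        label_func=is_cat, item_tfms=Resize(224))

    # Fine-tune a pretrained ResNet-34 for one epoch
    learn = vision_learner(dls, resnet34, metrics=error_rate)  # cnn_learner in older fastai releases
    learn.fine_tune(1)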

Test your knowledge

In this chapter, we have reviewed a broad range of topics, from taking full advantage of the information that fastai provides about models to making your web deployments available to users outside of your local system. In this section, you will get the opportunity to exercise some of the concepts you learned about in this chapter.

Explore the value of repeatable results

In the Using callbacks to get the most out of your training cycle recipe, you made a call to the set_seed() function prior to training each of the models. In that recipe, I stated that these calls were necessary to ensure repeatable results for multiple training cycles. Test out this assertion yourself by following these steps:

  1. First, make a copy of the training_with_tabular_datasets_callbacks.ipynb notebook.
  2. Update your new notebook by commenting out the first call to set_seed() and rerun the whole notebook. What differences do you see in the output of fit_one_cycle() between the...
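
For reference, the pattern the exercise asks you to experiment with looks roughly like the sketch below. This is not the notebook's actual code; the ADULT_SAMPLE dataset and the column choices are illustrative assumptions. The key point is the same: calling set_seed() with reproducible=True before building the DataLoaders and training makes repeated runs of fit_one_cycle() produce the same results, while commenting it out lets the results vary from run to run:

    from fastai.tabular.all import *

    path = untar_data(URLs.ADULT_SAMPLE)
    df = pd.read_csv(path/'adult.csv')

    # Seed all random number generators; comment this line out to see
    # run-to-run variation in the training results
    set_seed(42, reproducible=True)

    # Build tabular DataLoaders (column choices here are illustrative)
    dls = TabularDataLoaders.from_df(
        df, path=path, y_names='salary',
        cat_names=['workclass', 'education', 'marital-status'],
        cont_names=['age', 'fnlwgt'],
        procs=[Categorify, FillMissing, Normalize])

    # Train with the one-cycle policy; with the seed set, these metrics
    # should be identical across repeated runs of the notebook
    learn = tabular_learner(dls, metrics=accuracy)
    learn.fit_one_cycle(3)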