Building Data Science Applications with FastAPI - Second Edition

By: François Voron

Overview of this book

Building Data Science Applications with FastAPI is the go-to resource for creating efficient and dependable data science API backends. This second edition incorporates the latest Python and FastAPI advancements, along with two new AI projects: a real-time object detection system and a text-to-image generation platform using Stable Diffusion.

The book starts with the basics of FastAPI and modern Python programming. You'll grasp FastAPI's robust dependency injection system, which facilitates seamless database communication, authentication implementation, and ML model integration. As you progress, you'll learn testing and deployment best practices, guaranteeing high-quality, resilient applications.

Throughout the book, you'll build data science applications with FastAPI through projects covering common AI use cases, such as object detection and text-to-image generation. These hands-on experiences will deepen your understanding of using FastAPI in real-world scenarios. By the end of this book, you'll be well equipped to design, maintain, and monitor applications that meet the highest programming standards, empowering you to create fast and reliable data science API backends with ease while keeping up with the latest advancements.
Table of Contents (21 chapters)

Part 1: Introduction to Python and FastAPI
Part 2: Building and Deploying a Complete Web Backend with FastAPI
Part 3: Building Resilient and Distributed Data Science Systems with FastAPI

Technical requirements

For this chapter, you’ll need a Python virtual environment, just like the one we set up in Chapter 1, Python Development Environment Setup.
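As a quick reminder, the following commands show one common way to create and activate such an environment on Linux or macOS; the directory name venv is only a convention, and Chapter 1 covers the full setup, including Windows:

    python -m venv venv
    source venv/bin/activate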

To run the Stable Diffusion model correctly, we recommend you have a recent computer equipped with at least 16 GB of RAM and, ideally, a dedicated GPU with 8 GB of VRAM. For Mac users, recent models equipped with the M1 Pro or M2 Pro chips are also a good fit. If you don’t have that kind of machine, don’t worry: we’ll show you ways to run the system anyway – the only drawback is that image generation will be slow and show poor results.
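To illustrate how this hardware comes into play, here is a minimal device-selection sketch with PyTorch, assuming the model is loaded through Hugging Face Diffusers; the model identifier and prompt are placeholders, and the actual project code in this chapter may differ:

    import torch
    from diffusers import StableDiffusionPipeline

    # Pick the best available device: NVIDIA GPU, Apple Silicon GPU, or CPU fallback
    if torch.cuda.is_available():
        device = "cuda"
    elif torch.backends.mps.is_available():
        device = "mps"
    else:
        device = "cpu"

    # Half precision is only worthwhile on a CUDA GPU
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # example model identifier
        torch_dtype=torch.float16 if device == "cuda" else torch.float32,
    )
    pipe = pipe.to(device)

    image = pipe("a sunset over the mountains").images[0]
    image.save("output.png")

On a CPU-only machine, this still runs; generation simply takes much longer, and you may want to lower the number of inference steps to keep waiting times reasonable.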

To run the worker, you’ll need a Redis server running on your local computer. The easiest way is to run it as a Docker container. If you’ve never used Docker before, we recommend you read the Getting started tutorial in the official documentation at https://docs.docker.com/get-started/. Once done, you’ll be able to run a Redis server...
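For reference, one common way to start a local Redis server with Docker is shown below; the container name is arbitrary, and 6379 is Redis’s default port:

    docker run -d --name worker-redis -p 6379:6379 redis
    docker exec -it worker-redis redis-cli ping   # should answer PONG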