Speed Up Your Python with Rust

By: Maxwell Flitton
Overview of this book

Python has made software development easier, but it falls short in several areas, including memory management, which can lead to poor performance and security issues. Rust, on the other hand, provides memory safety without a garbage collector, which means that with its low memory footprint you can build high-performance, secure applications relatively easily. However, rewriting everything in Rust can be expensive and risky, as there might not be package support in Rust for the problem being solved. This is where Python bindings and pip come in. This book will help you, as a Python developer, start using Rust in your Python projects without having to manage a separate Rust server or application. Since you'll already understand concepts such as functions and loops, this book focuses on the quirks of Rust, such as memory management, so that you can write Rust in a productive and structured manner. You'll explore the PyO3 crate to fuse Rust code with Python, learn how to package your fused Rust code in a pip package, and then deploy a Python Flask application in Docker that uses a private Rust pip module. Finally, you'll get to grips with advanced Rust binding topics such as inspecting Python objects and modules in Rust. By the end of this Rust book, you'll be able to develop safe, high-performance applications with better concurrency support.
Table of Contents (16 chapters)

Section 1: Getting to Understand Rust
Section 2: Fusing Rust with Python
Section 3: Infusing Rust into a Web Application

Summary

In this chapter, we went through the basics of multiprocessing and multithreading, and then covered practical ways to utilize threads and processes. We used the Fibonacci sequence to show how processes can speed up our computations. The Fibonacci sequence also showed us that how we compute our problems matters more than whether we use threads or processes: algorithms that scale exponentially should be fixed before reaching for multiprocessing for speed gains. We must remember that while it might be tempting to reach for more complex approaches to multiprocessing, this can lead to problems such as deadlock and data races. We kept our multiprocessing tight by keeping it contained within a processing pool. If we keep these principles in mind and keep all our multiprocessing contained to a pool, we will keep our hard-to-diagnose problems to a minimum. This does not mean that we should never be creative with multiprocessing, but it is advised to do further reading...
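
The following is a minimal sketch of the two principles above, not the chapter's exact code; the function names and the input values are illustrative assumptions. It contrasts an exponential recursive Fibonacci with a linear iterative one, and keeps all multiprocessing contained within a pool:

from multiprocessing import Pool


def fibonacci_naive(n: int) -> int:
    # Exponential-time recursion: fix the algorithm before
    # reaching for multiprocessing for speed gains.
    if n <= 1:
        return n
    return fibonacci_naive(n - 1) + fibonacci_naive(n - 2)


def fibonacci(n: int) -> int:
    # Linear-time iteration: each task is now cheap.
    previous, current = 0, 1
    for _ in range(n):
        previous, current = current, previous + current
    return previous


if __name__ == "__main__":
    numbers = [25, 30, 35, 40]
    # All multiprocessing stays inside the pool's context manager,
    # so worker processes are created and cleaned up here and
    # nowhere else.
    with Pool(processes=4) as pool:
        results = pool.map(fibonacci, numbers)
    print(dict(zip(numbers, results)))

Using the pool as a context manager is what keeps the multiprocessing "contained": the workers cannot outlive the with block, which is the containment the chapter recommends for avoiding hard-to-diagnose problems.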