Advanced Python Programming - Second Edition

By: Quan Nguyen

Overview of this book

Python's powerful capabilities for implementing robust and efficient programs make it one of the most sought-after programming languages. In this book, you'll explore the tools that allow you to improve performance and take your Python programs to the next level. This book starts by examining the built-in as well as external libraries that streamline tasks in the development cycle, such as benchmarking, profiling, and optimizing. You'll then get to grips with using specialized tools such as dedicated libraries and compilers to increase your performance at number-crunching tasks, including training machine learning models. The book covers concurrency, a major solution to making programs more efficient and scalable, and various concurrent programming techniques such as multithreading, multiprocessing, and asynchronous programming. You'll also understand the common problems that cause undesirable behavior in concurrent programs. Finally, you'll work with a wide range of design patterns, including creational, structural, and behavioral patterns that enable you to tackle complex design and architecture challenges, making your programs more robust and maintainable. By the end of the book, you'll be exposed to a wide range of advanced functionalities in Python and be equipped with the practical knowledge needed to apply them to your use cases.
Table of Contents (32 chapters)

Section 1: Python-Native and Specialized Optimization
Section 2: Concurrency and Parallelism
Section 3: Design Patterns in Python

Summary

JAX is a Python- and NumPy-friendly library that offers high-performance tools specific to machine learning tasks. JAX centers its API on function transformations: in one line of code, users pass in generic Python functions and receive transformed versions that would otherwise be expensive to compute or require more advanced implementations. This transformation syntax also enables the flexible, complex compositions of functions that are common in machine learning.
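As a minimal sketch of this composability (the function and data here are illustrative, not from the chapter), the gradient, JIT, and vectorization transformations can be stacked on an ordinary Python function in a single line:

```python
import jax
import jax.numpy as jnp

# A plain Python function: squared error of a linear model on one example.
def loss(w, x, y):
    return (jnp.dot(x, w) - y) ** 2

# One-line transformations that compose freely:
# gradient w.r.t. w, vectorized over a batch of (x, y) pairs, JIT-compiled.
grad_loss = jax.grad(loss)
fast_batched_grad = jax.jit(jax.vmap(grad_loss, in_axes=(None, 0, 0)))

w = jnp.array([1.0, 2.0])
xs = jnp.array([[1.0, 0.0], [0.0, 1.0]])
ys = jnp.array([0.0, 0.0])
print(fast_batched_grad(w, xs, ys))  # one gradient per example
```

Because each transformation returns an ordinary callable, the order of composition is up to you; here `vmap` is applied before `jit` so that the whole batched computation is compiled as one unit.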

Throughout this chapter, we have worked through a binary classification example to see how to utilize JAX: computing the gradients of machine learning loss functions with automatic differentiation, JIT-compiling our code for further optimization, and vectorizing kernel functions. These tasks appear in most machine learning use cases, so you will be able to apply what we have discussed here seamlessly to your own needs.
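To recap the pattern in miniature (a hedged sketch with made-up data, not the chapter's exact code), a binary cross-entropy loss can be differentiated and compiled in the same way, yielding a gradient function ready for training steps:

```python
import jax
import jax.numpy as jnp

# Illustrative logistic-regression loss for binary classification.
def binary_loss(w, X, y):
    probs = jax.nn.sigmoid(X @ w)
    # Binary cross-entropy, averaged over the batch.
    return -jnp.mean(y * jnp.log(probs) + (1 - y) * jnp.log(1 - probs))

# Automatic differentiation w.r.t. the weights, JIT-compiled.
grad_fn = jax.jit(jax.grad(binary_loss))

X = jnp.array([[0.5, 1.0], [1.5, -0.5], [-1.0, 2.0]])
y = jnp.array([1.0, 0.0, 1.0])
w = jnp.zeros(2)

# One gradient-descent step lowers the loss on this toy data.
w = w - 0.1 * grad_fn(w, X, y)
```

The same `grad_fn` can then be called in a loop for full training; only the first call pays the compilation cost.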

At this point, we have reached...