Advanced Python Programming

By: Dr. Gabriele Lanaro, Quan Nguyen, Sakis Kasampalis

Overview of this book

This Learning Path shows you how to leverage the power of both native and third-party Python libraries for building robust and responsive applications. You will learn about profilers and reactive programming, concurrency and parallelism, as well as tools for making your apps quick and efficient. You will discover how to write code for parallel architectures using TensorFlow and Theano, and use a cluster of computers for large-scale computations using technologies such as Dask and PySpark. With the knowledge of how Python design patterns work, you will be able to clone objects, secure interfaces, dynamically choose algorithms, and accomplish much more in high performance computing. By the end of this Learning Path, you will have the skills and confidence to build engaging models that quickly offer efficient solutions to your problems.

This Learning Path includes content from the following Packt products:

• Python High Performance - Second Edition by Gabriele Lanaro
• Mastering Concurrency in Python by Quan Nguyen
• Mastering Python Design Patterns by Sakis Kasampalis

Client-side communication with aiohttp


In previous sections, we covered examples of implementing asynchronous communication channels with the asyncio module, mostly from the perspective of the server side of the communication process. In other words, we have been considering how to handle and process requests sent from external systems. That, however, is only one side of the equation; the client side of communication remains to be explored. In this section, we will discuss how to apply asynchronous programming to the requests we ourselves make to servers.
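To make the idea concrete, here is a minimal sketch of a client-side request made with aiohttp. It assumes that the aiohttp package is installed (pip install aiohttp) and uses https://example.com purely as a placeholder URL; the structure of the examples later in this chapter may differ.

import asyncio

import aiohttp  # third-party library; install with `pip install aiohttp`


async def fetch(session, url):
    # Issue an asynchronous GET request and return the response body as text.
    async with session.get(url) as response:
        return await response.text()


async def main():
    # A ClientSession manages connection pooling and should be reused
    # across requests rather than created for every single request.
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, 'https://example.com')  # placeholder URL
        print(html[:200])  # print the first 200 characters of the HTML source


if __name__ == '__main__':
    asyncio.run(main())  # requires Python 3.7+

While each request still has to wait on network I/O, awaiting it inside a coroutine lets the event loop run other tasks in the meantime, which is exactly what we will exploit when many requests are in flight at once.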

As you have most likely guessed, the end goal of this process is to efficiently collect data from external systems by asynchronously making requests to those systems. We will be revisiting the concept of web scraping, which is the process of automating HTTP requests to various websites and extracting specific information from their HTML source code. If you have not read Chapter 12, Concurrent Web Requests, I highly recommend going through it before proceeding...
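As a rough sketch of what such client-side scraping can look like, the snippet below requests several pages concurrently and pulls out each page's <title> tag. The URL list and the get_title helper are hypothetical, and the naive string search merely stands in for whatever HTML processing the actual examples perform.

import asyncio

import aiohttp


# Hypothetical list of target websites; substitute the sites you want to scrape.
URLS = ['https://example.com', 'https://example.org', 'https://example.net']


async def get_title(session, url):
    # Download the page and extract the contents of its <title> tag with a
    # naive string search (a real scraper would use a proper HTML parser).
    async with session.get(url) as response:
        html = await response.text()
    start = html.find('<title>') + len('<title>')
    end = html.find('</title>')
    return url, html[start:end]


async def main():
    async with aiohttp.ClientSession() as session:
        # Schedule all requests at once; the event loop interleaves them
        # while each one waits on network I/O.
        results = await asyncio.gather(*(get_title(session, url) for url in URLS))
    for url, title in results:
        print(f'{url}: {title}')


asyncio.run(main())

Compared with issuing the same requests one after another, the total running time here is roughly that of the slowest response rather than the sum of all of them, which is the main payoff of an asynchronous client.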