Python for Geeks

By: Muhammad Asif

Overview of this book

Python is a multipurpose language that can be applied to a wide range of use cases. Python for Geeks will teach you how to advance in your career with the help of expert tips and tricks. You'll start by exploring the different ways of using Python optimally, from both a design and an implementation point of view. Next, you'll understand the life cycle of a large-scale Python project. As you advance, you'll focus on different ways of creating an elegant design by modularizing a Python project and learn best practices and design patterns for using Python. You'll also discover how to scale Python beyond a single thread and how to implement multiprocessing and multithreading in Python. In addition to this, you'll understand how to deploy Python not only on a single machine but also on clusters in private as well as public cloud computing environments. You'll then explore data processing techniques, focus on reusable, scalable data pipelines, and learn how to use these advanced techniques for network automation, serverless functions, and machine learning. Finally, you'll focus on strategizing web development design using the techniques and best practices covered in the book. By the end of this Python book, you'll be able to do serious Python programming for large-scale, complex projects.
Table of Contents (20 chapters)

Section 1: Python, beyond the Basics
Section 2: Advanced Programming Concepts
Section 3: Scaling beyond a Single Thread
Section 4: Using Python for Web, Cloud, and Network Use Cases

Learning about the cluster options for parallel processing

When we have a large volume of data to process, it is not efficient, and sometimes not even feasible, to use a single machine with multiple cores. This is especially challenging when working with real-time streaming data. For such scenarios, we need systems that can process data in a distributed manner, performing the tasks on multiple machines in parallel. Using multiple machines to process compute-intensive tasks in parallel and in a distributed manner is called cluster computing. Several big data distributed frameworks are available to coordinate the execution of jobs across a cluster, but Hadoop MapReduce and Apache Spark are the leading contenders in this race. Both frameworks are open source projects from Apache. There are many variants (for example, Databricks) of these two platforms available with add-on features as well as maintenance support, but the fundamentals remain...
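
To make the idea concrete, here is a minimal PySpark sketch of a distributed word count, the classic MapReduce-style job that both Hadoop MapReduce and Spark are designed to run across a cluster. This is an illustrative example, not a listing from the book: it assumes PySpark is installed, uses local mode so it can run on a single machine, and reads a hypothetical input file named data.txt.

from pyspark.sql import SparkSession

# Create (or reuse) a SparkSession. On a real cluster the master URL
# would point to YARN, Kubernetes, or a standalone Spark master;
# "local[*]" is used here only so the sketch runs on one machine.
spark = (
    SparkSession.builder
    .appName("cluster-word-count")
    .master("local[*]")  # assumption: local mode for illustration
    .getOrCreate()
)

# Read the input file as an RDD of lines; "data.txt" is a hypothetical path.
lines = spark.sparkContext.textFile("data.txt")

# MapReduce-style pipeline: split lines into words (map), pair each word
# with a count of 1, then sum the counts per word (reduce). Spark
# distributes these stages across the cluster's worker nodes.
word_counts = (
    lines.flatMap(lambda line: line.split())
         .map(lambda word: (word, 1))
         .reduceByKey(lambda a, b: a + b)
)

# Bring a small sample of results back to the driver for inspection.
for word, count in word_counts.take(10):
    print(word, count)

spark.stop()

The same logical job could be expressed as a Hadoop MapReduce program, but Spark's in-memory execution and higher-level API typically make such pipelines shorter to write and faster to iterate on, which is one reason it is a leading choice for cluster computing.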