Python for Geeks

By: Muhammad Asif

Overview of this book

Python is a multipurpose language that can be applied to a wide range of use cases. Python for Geeks will teach you how to advance in your career with the help of expert tips and tricks. You'll start by exploring the different ways of using Python optimally, both from the design and implementation points of view. Next, you'll understand the life cycle of a large-scale Python project. As you advance, you'll focus on different ways of creating an elegant design by modularizing a Python project and learn best practices and design patterns for using Python. You'll also discover how to scale Python out beyond a single thread and how to implement multiprocessing and multithreading in Python. In addition to this, you'll understand how you can deploy Python not only on a single machine but also on clusters in private as well as public cloud computing environments. You'll then explore data processing techniques, focus on reusable, scalable data pipelines, and learn how to apply these advanced techniques to network automation, serverless functions, and machine learning. Finally, you'll focus on strategizing web development design using the techniques and best practices covered in the book. By the end of this Python book, you'll be able to do some serious Python programming for large-scale, complex projects.
Table of Contents (20 chapters)

Section 1: Python, beyond the Basics
Section 2: Advanced Programming Concepts
Section 3: Scaling beyond a Single Thread
Section 4: Using Python for Web, Cloud, and Network Use Cases

Using PySpark for parallel data processing

As discussed previously, Apache Spark is written in the Scala language, which means it has no native support for Python. However, there is a large community of data scientists and analytics experts who prefer to use Python for data processing because of the rich set of libraries available for it, and it is not convenient for them to switch to another programming language just for distributed data processing. Thus, integrating Python with Apache Spark is not only beneficial for the data science community but also opens the door for many others who would like to adopt Apache Spark without learning or switching to a new programming language.

The Apache Spark community has built a Python library, PySpark, to facilitate working with Apache Spark from Python. To make Python code work with Apache Spark, which is built on Scala (and Java), a Java library, Py4J, has been developed. Py4J is bundled with PySpark and allows the Python...
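
To give a concrete sense of how this looks in practice, the following is a minimal sketch of parallel data processing with PySpark. It assumes PySpark is installed locally (for example, via pip install pyspark) and a JVM is available; the application name, the sample data, and the partition count are illustrative choices, not values from the book.

from pyspark.sql import SparkSession

# Create a SparkSession; under the hood, Py4J relays these Python calls
# to the JVM-based Spark engine.
spark = SparkSession.builder \
    .appName("pyspark_parallel_example") \
    .master("local[*]") \
    .getOrCreate()

# Distribute a Python collection across 8 partitions as an RDD;
# transformations on it run in parallel across the available cores.
numbers = spark.sparkContext.parallelize(range(1, 1_000_001), numSlices=8)

# map() is applied per partition in parallel; sum() is an action that
# triggers the computation and returns the result to the driver.
total = numbers.map(lambda x: x * x).sum()
print(f"Sum of squares: {total}")

spark.stop()

Running with master("local[*]") uses all local CPU cores; pointing the same code at a cluster manager instead is what allows it to scale out without changing the Python logic.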