Multiprocess queues


When using multiple processes, the issue that comes up is how to exchange data between the workers. The multiprocessing module offers two mechanisms to do just that: queues and pipes. Pipes are sketched briefly below; after that, we are going to look at multiprocess queues.
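A pipe is the simpler of the two: multiprocessing.Pipe() returns a pair of connection objects, and whatever is sent on one end can be received on the other. The following is a minimal sketch of the idea; the child function and the values it exchanges are made up purely for illustration:

import multiprocessing as mp


def child(conn):
    # Receive a value from the parent and send back its square.
    value = conn.recv()
    conn.send(value * value)
    conn.close()


if __name__ == '__main__':
    parent_conn, child_conn = mp.Pipe()
    proc = mp.Process(target=child, args=(child_conn,))
    proc.start()
    parent_conn.send(21)       # hand a number to the child process
    print(parent_conn.recv())  # prints 441
    proc.join()

A multiprocessing queue is built on top of a pipe plus a few locks and semaphores, but it can be shared safely by any number of producer and consumer processes, which is exactly what a pool of workers needs.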

The multiprocessing.Queue class is modeled after the queue.Queue class, with the additional twist that items stored in the multiprocessing queue need to be picklable. To illustrate how to use these queues, create a new Python script (queues.py) with the following code:

import multiprocessing as mp


def fib(n):
    # Naive recursive Fibonacci: deliberately CPU-intensive.
    if n < 0:
        raise Exception('fib(n) is undefined for n < 0')
    elif n == 0:
        return 0
    elif n <= 2:
        return 1
    return fib(n - 1) + fib(n - 2)


def worker(inq, outq):
    # Keep pulling (function, argument) pairs from the input queue and
    # putting the results on the output queue; a None item is the
    # sentinel that tells the worker to exit.
    while True:
        data = inq.get()
        if data is None:
            return
        fn, arg = data
        outq.put(fn(arg))


if __name__ == '__main__':
    import argparse

    parser = argparse.ArgumentParser(...
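One possible way to finish the main block, starting again from the argument parser, is sketched below; the -n flag (number of worker processes) and the positional number argument (the Fibonacci input) are hypothetical names chosen for illustration and may differ from the book's own listing:

    parser = argparse.ArgumentParser()
    parser.add_argument('-n', type=int, default=4,
                        help='number of worker processes (assumed flag)')
    parser.add_argument('number', type=int, nargs='?', default=34,
                        help='compute fib(number) in each worker (assumed)')
    args = parser.parse_args()

    tasks = mp.Queue()    # (function, argument) work items
    results = mp.Queue()  # values computed by the workers

    # Start the worker processes.
    procs = [mp.Process(target=worker, args=(tasks, results))
             for _ in range(args.n)]
    for p in procs:
        p.start()

    # Enqueue one task per worker, followed by one None sentinel per
    # worker so that every worker eventually exits its loop.
    for _ in range(args.n):
        tasks.put((fib, args.number))
    for _ in range(args.n):
        tasks.put(None)

    # Collect and print the results, then wait for the workers to finish.
    for _ in range(args.n):
        print(results.get())
    for p in procs:
        p.join()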