
JavaScript Concurrency

By: Adam Boduch

Overview of this book

Concurrent programming may sound abstract and complex, but it helps to deliver a better user experience. With single-threaded JavaScript, applications lack dynamism: when JavaScript code is running, nothing else can happen. The DOM can't update, which means the UI freezes. In a world where users expect speed and responsiveness – in all senses of the word – this is something no developer can afford. Fortunately, JavaScript has evolved to adopt concurrent capabilities – one of the reasons why it is still at the forefront of modern web development.

This book helps you dive into concurrent JavaScript and demonstrates how to apply its core principles, key techniques, and tools to a range of complex development challenges. Built around the three core principles of concurrency – parallelism, synchronization, and conservation – you'll learn everything you need to unlock a more efficient and dynamic JavaScript, and to lay the foundations of even better user experiences. Throughout the book, you'll learn how to put these principles into action using a range of development approaches. Covering everything from JavaScript promises and web workers to generators and functional programming techniques, what you learn will have a real impact on the performance of your applications. You'll also learn how to move between client and server, for a more frictionless and fully realized approach to development.

With further guidance on concurrent programming with Node.js, JavaScript Concurrency is committed to making you a better web developer. The best developers know that great design is about more than the UI – with concurrency, you can be confident every project will be expertly designed to guarantee its dynamism and power.
Table of Contents (17 chapters)
JavaScript Concurrency
Credits
About the Author
About the Reviewer
www.PacktPub.com
Preface
Index

Do we need to go parallel?


Parallelism can be hugely beneficial for the right sort of problem, but creating workers and synchronizing the communication between them to carry out tasks isn't free. For example, we could have nice, well-thought-out parallel code that utilizes four CPU cores, only to find that the time spent executing the boilerplate code that facilitates this parallelism exceeds the cost of simply processing the data in a single thread.
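To make that overhead concrete, here is a minimal sketch, assuming a browser environment, that compares summing a small array inline against handing the same work to a web worker built from a Blob URL. The worker body, the input size, and the timing labels are illustrative choices for this example, not code from the book; for an input this small, the setup and messaging dominate the total time.

// The worker body is inlined through a Blob URL so the sketch stays
// self-contained; in real code it would usually live in its own file.
const workerSource = `
  self.onmessage = (e) => {
    let total = 0;
    for (const n of e.data) {
      total += n;
    }
    self.postMessage(total);
  };
`;

const data = new Array(1000).fill(1);

// Synchronous path: no setup cost at all.
console.time('inline');
const inlineTotal = data.reduce((a, b) => a + b, 0);
console.timeEnd('inline');

// Parallel path: constructing the worker and copying the data across the
// message channel is pure overhead for an input this small.
console.time('worker');
const url = URL.createObjectURL(
  new Blob([workerSource], { type: 'text/javascript' })
);
const worker = new Worker(url);

worker.onmessage = (e) => {
  console.timeEnd('worker');
  console.log('inline result:', inlineTotal, 'worker result:', e.data);
  worker.terminate();
  URL.revokeObjectURL(url);
};

worker.postMessage(data);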

In this section, we'll address the issues involved in validating the data we're processing and determining the hardware capabilities of the system. We'll always want a synchronous fallback option for the scenarios where parallel execution simply doesn't make sense. When we do decide to go parallel, our next job is to figure out exactly how the work gets distributed to workers. All of these checks are performed at runtime.
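As a rough illustration of those runtime checks, the following sketch decides between a synchronous fallback and a chunk-per-core split based on navigator.hardwareConcurrency and the size of the input. The threshold value, the helper names, and the squaring workload are assumptions made for this example, not the book's API.

// Illustrative cut-off: below this many items per worker, the messaging
// overhead tends to outweigh any gain from parallelism. Tune per workload.
const MIN_ITEMS_PER_WORKER = 10000;

function shouldGoParallel(data) {
  const cores = navigator.hardwareConcurrency || 1;
  // Parallelism only pays off with more than one core and enough data
  // to keep every worker busy beyond the setup and messaging cost.
  return cores > 1 && data.length >= cores * MIN_ITEMS_PER_WORKER;
}

// Synchronous fallback: process the data directly on the main thread.
function processSync(data) {
  return data.map((n) => n * n);
}

// Decide how the work gets distributed: one chunk per core.
function chunk(data, parts) {
  const size = Math.ceil(data.length / parts);
  const chunks = [];
  for (let i = 0; i < data.length; i += size) {
    chunks.push(data.slice(i, i + size));
  }
  return chunks;
}

function process(data) {
  if (!shouldGoParallel(data)) {
    return processSync(data);
  }
  // Each chunk would be posted to its own worker here; the worker setup
  // is omitted to keep the sketch focused on the decision logic.
  const chunks = chunk(data, navigator.hardwareConcurrency);
  return chunks.flatMap(processSync); // stand-in for the parallel path
}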

How big is the data?

Sometimes, going parallel just isn't worthwhile. The idea with parallelism is to...