Node.js Design Patterns - Third Edition

By: Mario Casciaro, Luciano Mammino
Overview of this book

In this book, we will show you how to implement a series of best practices and design patterns to help you create efficient and robust Node.js applications with ease. We kick off by exploring the basics of Node.js, analyzing its asynchronous event-driven architecture and its fundamental design patterns. We then show you how to build asynchronous control flow patterns with callbacks, promises, and async/await. Next, we dive into Node.js streams, unveiling their power and showing you how to use them at their full capacity. After streams, we analyze different creational, structural, and behavioral design patterns that take full advantage of JavaScript and Node.js. Lastly, the book dives into more advanced concepts such as Universal JavaScript, scalability, and messaging patterns to help you build enterprise-grade distributed applications. Throughout the book, you'll see Node.js in action with the help of several real-life examples leveraging technologies such as LevelDB, Redis, RabbitMQ, ZeroMQ, and many others. They will be used to demonstrate a pattern or technique, but they will also give you a great introduction to the Node.js ecosystem and its set of solutions.
Asynchronous request batching and caching

In high-load applications, caching plays a critical role. It is used almost everywhere on the web, from static resources such as web pages, images, and stylesheets, to pure data such as the results of database queries. In this section, we are going to learn how caching applies to asynchronous operations and how a high request throughput can be turned to our advantage.

What's asynchronous request batching?

When dealing with asynchronous operations, the most basic level of caching can be achieved by batching together a set of invocations to the same API. The idea is very simple: if we invoke an asynchronous function while there is still another one pending, we can piggyback on the already running operation instead of creating a brand new request. Take a look at the following diagram:

Figure 11.1: Two asynchronous requests with no batching

The previous diagram shows two clients...
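To make the idea concrete, here is a minimal sketch of request batching (not the book's own example). It assumes a hypothetical slowQuery() function standing in for any expensive asynchronous operation, and keeps a map of in-flight promises so that concurrent invocations with the same argument piggyback on a single underlying request.

import { setTimeout as delay } from 'timers/promises'

// A hypothetical slow asynchronous operation (imagine an expensive database query).
async function slowQuery (key) {
  await delay(1000)
  return `result for ${key}`
}

// Map of requests that are currently in flight, keyed by the request parameter.
const runningRequests = new Map()

function batchedQuery (key) {
  // If a request for the same key is already pending, piggyback on it
  // instead of starting a brand new operation.
  if (runningRequests.has(key)) {
    return runningRequests.get(key)
  }

  // Start the operation and remove the entry as soon as it settles:
  // batching only coalesces requests that arrive while it is still running.
  const resultPromise = slowQuery(key)
    .finally(() => runningRequests.delete(key))
  runningRequests.set(key, resultPromise)
  return resultPromise
}

// Two "clients" invoking the API while the first request is still pending
// both receive the result of the same underlying operation.
batchedQuery('totalSales').then(console.log)
batchedQuery('totalSales').then(console.log)

Note that the entry is removed as soon as the promise settles, so only invocations that overlap in time share a result; keeping the settled value around for later requests is where caching, covered in the rest of this section, comes in.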