Architecting Data-Intensive Applications

By : Anuj Kumar

Overview of this book

Are you an architect or a developer who looks at your own applications gingerly while browsing through Facebook, silently applauding it for its data-intensive yet fluent and efficient behaviour? This book is your gateway to building smart data-intensive systems by incorporating the core data-intensive architectural principles, patterns, and techniques directly into your application architecture.

This book starts by taking you through the primary design challenges involved in architecting data-intensive applications. You will learn how to implement data curation and data dissemination, depending on the volume of your data. You will then implement your application architecture one step at a time. You will get to grips with implementing the correct message delivery protocols and creating a data layer that doesn't fail when running under high traffic. This book will show you how you can divide your application into layers, each of which adheres to the single responsibility principle. By the end of this book, you will be able to streamline your thoughts and make the right choice of technologies and architectural principles based on the problem at hand.
Table of Contents (18 chapters)

Event-Data Pipelines


An event-data pipeline is a much more important component of a data-intensive application than a query-data pipeline. It needs to handle a large volume of event data, produced by many data sources, in a reliable, efficient, and fault-tolerant manner. The event-data pipeline follows the publish-subscribe principle, as shown in the following diagram:

A client publishes a message to a central queuing server, and other clients that wish to receive messages subscribe with the central queuing server to have those messages delivered to them.
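The publish-subscribe interaction described above can be sketched as a minimal in-process broker. This is an illustrative sketch only: the `Broker` class, its topic names, and its API are hypothetical stand-ins for the central queuing server, not the API of any specific messaging product.

```python
# Minimal in-process sketch of the publish-subscribe principle.
# Broker stands in for the central queuing server; subscribers register
# a handler per topic, and publish() delivers each event to all of them.
from collections import defaultdict
from typing import Callable, Dict, List

class Broker:
    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """Register a handler to receive every event published on this topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        """Deliver the event to every handler subscribed to this topic."""
        for handler in self._subscribers[topic]:
            handler(event)

broker = Broker()
received: List[dict] = []
broker.subscribe("clicks", received.append)   # one subscriber on the "clicks" topic
broker.publish("clicks", {"user": "u1", "page": "/home"})
```

A production pipeline would add durable storage, acknowledgements, and network transport on top of this decoupling, but the publisher/broker/subscriber roles stay the same.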

Whenever you try to define an event-data pipeline, keep in mind the three V's of the data:

  • Volume: What are the average and peak volumes of data that we expect?
  • Variety: What varieties of data do we expect: documents, events, sizes, and so on?
  • Velocity: What are the average and peak rates at which we can expect the data to arrive?
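To make the three V's concrete, they can be written down as explicit capacity-planning estimates before any technology is chosen. The sketch below is hypothetical: the class name, fields, and all figures are illustrative assumptions, not numbers from the book.

```python
# Hypothetical capacity-planning sketch capturing the three V's as
# explicit numbers (all figures illustrative).
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PipelineEstimate:
    avg_events_per_sec: int    # Velocity: average arrival rate
    peak_events_per_sec: int   # Velocity: peak arrival rate
    avg_event_size_bytes: int  # Variety/Volume: typical payload size
    event_types: Tuple[str, ...]  # Variety: kinds of data expected
    retention_days: int        # Volume: how long events are kept

    def peak_mb_per_sec(self) -> float:
        """Peak ingest bandwidth the pipeline must sustain, in MB/s."""
        return self.peak_events_per_sec * self.avg_event_size_bytes / 1e6

    def stored_gb(self) -> float:
        """Approximate storage footprint over the retention window, in GB."""
        bytes_per_day = self.avg_events_per_sec * 86_400 * self.avg_event_size_bytes
        return bytes_per_day * self.retention_days / 1e9

est = PipelineEstimate(
    avg_events_per_sec=5_000,
    peak_events_per_sec=50_000,
    avg_event_size_bytes=512,
    event_types=("click", "page_view"),
    retention_days=7,
)
```

Working the numbers through like this before picking a queue or broker makes it obvious whether a single node suffices or the pipeline must be partitioned.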

Let's go through the process of architecting an event-data pipeline.

There are many...