
Scalable Data Streaming with Amazon Kinesis

By : Tarik Makota, Brian Maguire, Danny Gagne, Rajeev Chakrabarti

Overview of this book

Amazon Kinesis is a collection of secure, serverless, durable, and highly available purpose-built data streaming services. These services provide APIs and client SDKs that enable you to produce and consume data at scale. Scalable Data Streaming with Amazon Kinesis begins with a quick overview of the core concepts of data streams, along with the essentials of the AWS Kinesis landscape. You'll then explore the requirements of the use case used throughout the book to help you get started, and cover the key pain points encountered in the data stream life cycle. As you advance, you'll get to grips with the architectural components of Kinesis, understand how they are configured to build data pipelines, and delve into the applications that connect to them for consumption and processing. You'll also build a Kinesis data pipeline from scratch and learn how to implement and apply practical solutions. Moving on, you'll learn how to configure Kinesis on a cloud platform. Finally, you'll learn how other AWS services can be integrated into Kinesis. These services include Amazon Redshift, Amazon DynamoDB, Amazon S3, Amazon Elasticsearch Service, and third-party applications such as Splunk. By the end of this AWS book, you'll be able to build and deploy your own Kinesis data pipelines with Kinesis Data Streams (KDS), Kinesis Data Firehose (KDF), Kinesis Video Streams (KVS), and Kinesis Data Analytics (KDA).
Table of Contents (13 chapters)

Section 1: Introduction to Data Streaming and Amazon Kinesis
Section 2: Deep Dive into Kinesis
Section 3: Integrations

Using data transformation in KDF with a Lambda function

KDF provides the ability to transform ingested records inline through integration with the AWS Lambda service: KDF invokes a Lambda function (called a Lambda transform) to perform custom processing, as long as the function's code adheres to a defined data transformation and status model. Data transformation is disabled by default and must be enabled in the delivery stream configuration. The following diagram illustrates how data transformation works in KDF:

Figure 5.3 – Data transformation with Lambda invocations
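The transformation and status model can be sketched as a short Lambda function. Each incoming record carries a base64-encoded data payload and a recordId, and the function must return one output entry per input record with the same recordId, a result status (Ok, Dropped, or ProcessingFailed), and the re-encoded payload. The uppercase transformation below is an arbitrary placeholder for your own processing logic:

```python
import base64


def lambda_handler(event, context):
    """Minimal KDF Lambda transform: uppercases each record's payload."""
    output = []
    for record in event["records"]:
        # Firehose delivers each record's data as a base64-encoded string.
        payload = base64.b64decode(record["data"]).decode("utf-8")

        # Custom processing goes here; uppercasing is just an illustration.
        transformed = payload.upper()

        output.append({
            "recordId": record["recordId"],  # must echo the incoming recordId
            "result": "Ok",                  # Ok | Dropped | ProcessingFailed
            "data": base64.b64encode(
                transformed.encode("utf-8")
            ).decode("utf-8"),
        })

    # The returned record count must match the incoming record count.
    return {"records": output}
```

Records marked Dropped are removed from the stream intentionally, while ProcessingFailed records are delivered to the configured error output, so the status field doubles as per-record error handling.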

Once enabled, incoming records are buffered, up to 3 megabytes (MB) by default. The buffer size can be adjusted through the ProcessingConfiguration API using the BufferSizeInMBs processor parameter. The AWS Console supports only the BufferSizeInMBs and BufferIntervalInSeconds parameters, which are available as textboxes, and lets you choose from a drop-down list of Lambda functions available in the...
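At the API level, these buffering hints travel alongside the Lambda function's ARN as processor parameters inside a ProcessingConfiguration block. The fragment below is a sketch of that structure as it would be passed in a delivery stream's destination configuration; the Lambda ARN and the parameter values are placeholder assumptions:

```python
# Hypothetical ProcessingConfiguration fragment for a KDF delivery stream.
# The LambdaArn value is a placeholder; substitute your own function's ARN.
processing_configuration = {
    "Enabled": True,
    "Processors": [
        {
            "Type": "Lambda",
            "Parameters": [
                {
                    "ParameterName": "LambdaArn",
                    "ParameterValue": (
                        "arn:aws:lambda:us-east-1:123456789012"
                        ":function:my-kdf-transform"
                    ),
                },
                # Buffer up to 3 MB of records before invoking the transform.
                {"ParameterName": "BufferSizeInMBs", "ParameterValue": "3"},
                # Or invoke after 60 seconds, whichever comes first.
                {"ParameterName": "BufferIntervalInSeconds",
                 "ParameterValue": "60"},
            ],
        }
    ],
}
```

When configuring programmatically (for example, via the AWS SDK's delivery stream create or update calls), this dictionary would be supplied as the ProcessingConfiguration field of the destination configuration.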