Scalable Data Streaming with Amazon Kinesis

By: Tarik Makota, Brian Maguire, Danny Gagne, Rajeev Chakrabarti

Overview of this book

Amazon Kinesis is a collection of secure, serverless, durable, and highly available purpose-built data streaming services. These services provide APIs and client SDKs that enable you to produce and consume data at scale. Scalable Data Streaming with Amazon Kinesis begins with a quick overview of the core concepts of data streams, along with the essentials of the AWS Kinesis landscape. You'll then explore the requirements of the use case used throughout the book to help you get started, and cover the key pain points encountered in the data stream life cycle. As you advance, you'll get to grips with the architectural components of Kinesis, understand how they are configured to build data pipelines, and delve into the applications that connect to them for consumption and processing. You'll also build a Kinesis data pipeline from scratch and learn how to implement and apply practical solutions. Moving on, you'll learn how to configure Kinesis on a cloud platform. Finally, you'll learn how other AWS services can be integrated into Kinesis. These services include Amazon Redshift, Amazon DynamoDB, Amazon S3, and Amazon Elasticsearch Service, as well as third-party applications such as Splunk. By the end of this AWS book, you'll be able to build and deploy your own Kinesis data pipelines with Kinesis Data Streams (KDS), Kinesis Data Firehose (KDF), Kinesis Video Streams (KVS), and Kinesis Data Analytics (KDA).
Table of Contents (13 chapters)

Section 1: Introduction to Data Streaming and Amazon Kinesis
Section 2: Deep Dive into Kinesis
Section 3: Integrations

Understanding encryption in KDF

KDF supports both encryption in transit and encryption at rest. KDF has a REST API that supports secure HTTP (that is, HTTPS). For encryption at rest, the method employed depends on the data ingestion mechanism. As explained in the Understanding KDF delivery streams section, there are two ways to ingest data into KDF: Direct PUT and a KDS stream as a source. In addition, KDF has integrations with a number of other AWS services, such as Amazon CloudWatch Logs, Amazon CloudWatch Events, AWS Internet of Things (IoT), and Amazon Simple Notification Service (SNS), which allow those services to send data to KDF.
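As a minimal sketch of the Direct PUT path, the following Python snippet uses the boto3 Firehose client, which calls the KDF REST API over HTTPS, so the record is encrypted in transit by default. The delivery stream name and the record payload are placeholders, not values from the book's example.

```python
import boto3

# boto3 signs and sends requests to the KDF endpoint over HTTPS,
# providing encryption in transit without extra configuration.
firehose = boto3.client("firehose", region_name="us-east-1")

# Direct PUT of a single record; "example-delivery-stream" is a placeholder name.
response = firehose.put_record(
    DeliveryStreamName="example-delivery-stream",
    Record={"Data": b'{"sensor_id": "42", "temperature": 21.5}\n'},
)

# The response indicates whether the record was encrypted at rest on arrival.
print(response["RecordId"], response.get("Encrypted"))
```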

For Direct PUT using either the PutRecord or PutRecordBatch API, and for other AWS services sending data to KDF, you can enable encryption at rest (or server-side encryption) using an AWS Key Management Service (KMS) customer master key (CMK). The CMK can be either an AWS-owned CMK or a customer-managed CMK. AWS-owned CMKs are not in your account....
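To illustrate, here is a hedged sketch of turning on server-side encryption for an existing Direct PUT delivery stream with the StartDeliveryStreamEncryption API via boto3. The stream name and key ARN are placeholders; substituting KeyType="AWS_OWNED_CMK" (and omitting KeyARN) uses an AWS-owned CMK instead of a customer-managed one.

```python
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# Enable server-side encryption on an existing Direct PUT delivery stream.
# The delivery stream name and the KMS key ARN below are hypothetical values.
firehose.start_delivery_stream_encryption(
    DeliveryStreamName="example-delivery-stream",
    DeliveryStreamEncryptionConfigurationInput={
        "KeyType": "CUSTOMER_MANAGED_CMK",
        "KeyARN": "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
    },
)
```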