Kafka Streams with Spring Cloud Stream [Video]

By : Prashant Kumar Pandey
5 (1)
Overview of this book

Kafka Streams with Spring Cloud Stream will help you understand stream processing in general and apply it to Kafka Streams programming using Spring Boot. This course uses the Kafka Streams library compatible with Spring Cloud 2020. All the source code and examples used in this course have been tested by the author on Confluent Platform 6.0.0, which is compatible with the Apache Kafka 2.6 open-source distribution.

This is a fully example-driven course, and you will be working with multiple examples throughout. We will make extensive use of IntelliJ IDEA as the preferred development IDE and Apache Maven and Gradle as the preferred build tools. However, based on your prior experience, you should be able to work with any other IDE designed for Spring application development and any other build tool designed for Java applications. This course also uses Log4J2 to teach you industry-standard logging in your application, and JUnit 5, the latest version of JUnit, to implement unit test cases.

Working examples and exercises are the most critical tools for sharpening your skills, so this course includes programming assignments where appropriate. These exercises will help you validate your concepts and apply your learning to solve programming problems. The code bundles for this course are available at https://github.com/PacktPublishing/Kafka-Streams-with-Spring-Cloud-Stream
Table of Contents (12 chapters)
12. Keep Learning
Chapter 6
Processing Kafka Streams
Section 5
Understanding Record Serialization
In this video, you will get a complete overview of record serialization in Spring Cloud and Kafka Streams. This discussion will help you handle various serialization requirements in your projects. Stream processing is all about processing data in real time. Data events arrive at a Kafka topic as a continuous stream of messages: each message travels over the wire to the Kafka broker and is stored in the topic. When you create a listener to read these messages from the Kafka topic, they again travel over the wire to your listener. In both directions, the messages travel as byte arrays.
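The idea above can be sketched in plain Java. This is a minimal, hypothetical illustration (not Kafka's actual serializer classes): it mimics what a string serializer and deserializer do, namely converting a value to a byte array before it goes over the wire and rebuilding it on the listener side. The class and method names here are illustrative assumptions.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Sketch of Kafka's serialization model: every key and value travels
// over the wire as a byte array. A serializer produces those bytes on
// the producer side; a deserializer rebuilds the value on the consumer side.
public class SerializationSketch {

    // Turn a value into the bytes that actually travel to the topic
    // (conceptually what a string serializer does).
    static byte[] serialize(String value) {
        return value.getBytes(StandardCharsets.UTF_8);
    }

    // Rebuild the value from the bytes the listener receives
    // (conceptually what a string deserializer does).
    static String deserialize(byte[] data) {
        return new String(data, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] wireBytes = serialize("order-42");  // what is written to the topic
        String received = deserialize(wireBytes);  // what your listener sees
        System.out.println(Arrays.toString(wireBytes));
        System.out.println(received);
    }
}
```

Real applications rarely stop at strings: the same byte-array contract is why you configure JSON or Avro serializers and deserializers for structured record types, a topic covered in the rest of this section.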