Pattern API


The Pattern API allows you to define complex event patterns very easily. Each pattern consists of multiple states, and to move from one state to the next we generally need to define conditions. These conditions can be based on the contiguity of events or on filtering the events.

Let's try to understand each pattern operation in detail.
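Before that, here is a minimal preview sketch, in Java, of how these operations chain into a two-state pattern. The followedBy() operation is covered later in this chapter, and the getName() accessor on the Event class is an assumption used purely for illustration; treat this as a sketch rather than the book's running example.

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.cep.pattern.Pattern;

// A two-state pattern: a "start" event followed later by an "end" event.
// getName() is a hypothetical accessor on the Event class.
Pattern<Event, ?> alertPattern = Pattern.<Event>begin("start")
    .where(new FilterFunction<Event>() {
        @Override
        public boolean filter(Event value) {
            return "start".equals(value.getName()); // condition for the first state
        }
    })
    .followedBy("end")
    .where(new FilterFunction<Event>() {
        @Override
        public boolean filter(Event value) {
            return "end".equals(value.getName()); // condition for the second state
        }
    });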

Begin

The initial state can be defined as follows:

In Java:

Pattern<Event, ?> start = Pattern.<Event>begin("start"); 

In Scala:

val start : Pattern[Event, _] = Pattern.begin("start") 

Filter

We can also specify the filter condition for the initial state:

In Java:

start.where(new FilterFunction<Event>() { 
    @Override 
    public boolean filter(Event value) { 
        return ... // condition 
    } 
}); 

In Scala:

start.where(event => ... /* condition */) 
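As a concrete illustration, the condition can be any boolean check on the incoming event. In the Java sketch below, the getTemperature() accessor is a hypothetical field on Event, used only for illustration:

// Keep only events whose (hypothetical) temperature reading reaches a threshold.
start.where(new FilterFunction<Event>() {
    @Override
    public boolean filter(Event value) {
        return value.getTemperature() >= 26.0;
    }
});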

Subtype

We can also filter events based on their subtype, using the subtype() method:

In Java:

start.subtype(SubEvent.class).where(new FilterFunction<SubEvent>() {
    @Override
    public boolean filter(SubEvent value) {
        return ... // condition
    }
});
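To see why restricting the subtype is useful, here is a hedged sketch in which the filter uses a field that only the subtype carries; the getVolume() accessor on SubEvent is an assumption for illustration, not part of the book's example:

// Restrict the pattern to SubEvent and filter on a subtype-specific field.
// getVolume() is a hypothetical accessor on SubEvent.
start.subtype(SubEvent.class).where(new FilterFunction<SubEvent>() {
    @Override
    public boolean filter(SubEvent value) {
        return value.getVolume() > 100.0;
    }
});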