Kafka Streams - Advanced skills for developing stream processing applications
Price: €1,950 excl. tax, per attendee
For more information about this training course, please feel free to contact us:
During this three-day, instructor-led, hands-on course, you will learn how to use the advanced Kafka Streams APIs and discuss best practices for both development and production.
You will learn how to use the Kafka Streams DSL API and the low-level Processor API to build stream processing topologies. In addition, you will learn how to use internal state stores to implement stateful applications.
50% theory, 50% practice
Who Should Attend?
This course is designed for application developers, architects, and data engineers who need to build and deploy streaming applications that enrich, transform, join, and query data flowing through Apache Kafka in real time.
Attendees should be familiar with developing in Java and with the core concepts of Apache Kafka.
1 ) Back to Basics
The Components of a Kafka cluster
- Broker, Producer, Consumer
- Messages, Topics, Partitions
- OS Page-cache
Scalability inside consumer groups
Replication and Fault-Tolerance
- The roles of brokers (Leader, Follower, Controller)
- In-Sync Replicas
- Producer and Message Delivery Reliability
The retention policies (delete, compact)
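The reliability settings touched on above can be sketched as a minimal producer configuration. The values shown are illustrative defaults to discuss, not recommendations for any particular cluster:

```properties
# Producer-side delivery reliability (illustrative values)
acks=all                       # wait for all in-sync replicas to acknowledge
enable.idempotence=true        # avoid duplicates on retry
retries=2147483647             # retry transient failures indefinitely
delivery.timeout.ms=120000     # upper bound on total send + retry time

# Broker/topic-level counterpart (set on the topic, not the producer):
# min.insync.replicas=2        # acks=all fails if fewer replicas are in sync
```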
2 ) Introduction to Kafka Streams
Why Kafka Streams?
Use-Cases & Key Features
Levels of abstraction
Concepts of Streams and Tables
Building a simple topology using DSL API
The underlying Processor API
Sub-topologies and repartition topics
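As a sketch of the DSL concepts listed above, here is a minimal word-count topology. The topic names `text-input` and `word-counts` are hypothetical; note how `groupBy` re-keys the stream and therefore introduces a repartition topic, splitting the topology into two sub-topologies:

```java
import java.util.Arrays;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class WordCountTopology {

    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();

        // Source: a stream of text lines from a (hypothetical) input topic
        KStream<String, String> lines = builder.stream("text-input");

        KTable<String, Long> counts = lines
            // Split each line into words
            .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
            // Re-keying by word triggers a repartition topic (new sub-topology)
            .groupBy((key, word) -> word)
            // Stateful count, backed by a local state store
            .count();

        // Sink: publish the changelog of counts to an output topic
        counts.toStream().to("word-counts",
            Produced.with(Serdes.String(), Serdes.Long()));

        return builder.build();
    }
}
```

Printing `build().describe()` shows the two sub-topologies connected by the repartition topic, which is a useful exercise when discussing topology internals.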
3 ) Architecture and Threading Model
Consumption and processing model
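A concrete entry point for the threading discussion is the single configuration knob below; one KafkaStreams instance runs this many StreamThreads, and tasks (one per input partition) are distributed across them:

```properties
# Illustrative setting: run 4 StreamThreads per application instance
num.stream.threads=4
```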
4 ) The Processor API
Accessing Processor Context
Forwarding records to the downstream processors
Building a topology
Manipulating local states
Punctuation operations (Punctuator API)
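The Processor API topics above can be sketched in one small processor, assuming a state store named `counts-store` has been registered on the topology (the store name and topic wiring are hypothetical). It accesses the `ProcessorContext`, manipulates a local state store, and schedules a punctuator that forwards records downstream:

```java
import java.time.Duration;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;
import org.apache.kafka.streams.state.KeyValueStore;

public class CountingProcessor implements Processor<String, String, String, Long> {

    private ProcessorContext<String, Long> context;
    private KeyValueStore<String, Long> store;

    @Override
    public void init(ProcessorContext<String, Long> context) {
        this.context = context;
        // "counts-store" is a hypothetical store registered via Topology#addStateStore
        this.store = context.getStateStore("counts-store");

        // Punctuator: every 10 seconds of stream time, forward all current counts
        context.schedule(Duration.ofSeconds(10), PunctuationType.STREAM_TIME, ts -> {
            try (var iter = store.all()) {
                iter.forEachRemaining(kv ->
                    context.forward(new Record<>(kv.key, kv.value, ts)));
            }
        });
    }

    @Override
    public void process(Record<String, String> record) {
        // Stateful per-key counting in the local store
        Long count = store.get(record.key());
        store.put(record.key(), count == null ? 1L : count + 1);
    }
}
```

This uses the newer `org.apache.kafka.streams.processor.api` interfaces (Kafka 2.7+); older courses may still show the deprecated `Processor<K, V>` variant.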
5 ) The Streams DSL API and Stateful operations
The API Overview
The API Abstractions & Operations
- KStream / KGroupedStream
- KTable / KGroupedTable
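The relationship between these abstractions can be sketched as follows (topic names `page-clicks` and `user-profiles` are hypothetical):

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KGroupedStream;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class AbstractionsSketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // KStream: an unbounded record stream; every record is an independent event
        KStream<String, String> clicks = builder.stream("page-clicks");

        // KGroupedStream: an intermediate, grouped view, ready for aggregation
        KGroupedStream<String, String> byUser = clicks.groupByKey();

        // Aggregating a KGroupedStream yields a KTable: a changelog where each
        // record is the latest value for its key
        KTable<String, Long> clickCounts = byUser.count();

        // A KTable can also be read directly, e.g. from a compacted topic
        KTable<String, String> profiles = builder.table("user-profiles");
    }
}
```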