Event-Driven and Streaming Architectures
Evolving to an event streaming architecture: The patterns and strategies to liberate your data
For more information about this training course, please feel free to contact:
During this three-day, instructor-led, hands-on course you will learn the patterns and strategies needed to implement a modern event-driven streaming architecture.
This course enables participants to acquire the following skills:
- Understanding the different event-driven architectures.
- Understanding the characteristics and benefits of event-driven architectures.
- Implementing different data integration and inter-application communication patterns.
- Modeling events and an event system.
- Understanding and using the different data streaming solutions.
60% theory, 40% practice
Who Should Attend?
This workshop is designed for Application Developers, Architects and Data Engineers.
Participants should be familiar with Java development. No previous knowledge of Apache Kafka or Apache Pulsar is required.
Module 1: Introduction to Event-driven Architectures
- Event-first applications: the motivations
- Event-driven Architectures
- Event-driven Microservices
- Event-driven Streaming applications
- The main characteristics
- Benefits and disadvantages
- The concepts of Microservices
Module 2: The fundamentals of an Event-driven Architecture
- The concept of Event Streams
- Event-driven microservices
- Event types and Event structures
- The principle of “Single Writer”
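The event structures and the "Single Writer" principle covered in this module can be sketched in a few lines. This is an illustrative Python sketch, not code from any framework; all class and field names are assumptions chosen for the example.

```python
# Sketch of a minimal event structure and the "Single Writer" principle:
# each event stream has exactly one owning service allowed to append to it.
import time
import uuid
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Event:
    type: str            # e.g. "OrderCreated"
    key: str             # entity / partitioning key
    payload: dict
    id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: float = field(default_factory=time.time)


class EventStream:
    def __init__(self, name, owner):
        self.name, self.owner, self.events = name, owner, []

    def append(self, writer, event):
        # Enforce Single Writer: only the owning service may publish here.
        if writer != self.owner:
            raise PermissionError(f"{writer} does not own stream {self.name}")
        self.events.append(event)


orders = EventStream("orders", owner="order-service")
orders.append("order-service", Event("OrderCreated", "order-42", {"total": 99.0}))
```

Because events are immutable (`frozen=True`) and only one writer exists per stream, every consumer sees the same authoritative, ordered history.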
Module 3: Event Brokers
- The different models: queuing and Publish/Subscribe
- Introduction to Apache Kafka
- Introduction to Apache Pulsar
- Comparison of the two solutions
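The two broker models contrasted in this module differ only in delivery semantics, which a toy in-memory broker can make concrete. This is a hedged illustration (no real broker API is used): a queue hands each message to exactly one consumer, while publish/subscribe delivers every message to every subscriber.

```python
# Illustrative in-memory broker contrasting queuing and Publish/Subscribe.
from collections import defaultdict
from itertools import count


class Broker:
    def __init__(self):
        self.queues = defaultdict(list)   # queue name -> list of consumers
        self.topics = defaultdict(list)   # topic name -> list of subscribers
        self._rr = defaultdict(count)     # per-queue round-robin counter

    def register_queue_consumer(self, queue, consumer):
        self.queues[queue].append(consumer)

    def subscribe(self, topic, subscriber):
        self.topics[topic].append(subscriber)

    def send(self, queue, message):
        # Queuing model: exactly one consumer receives each message.
        consumers = self.queues[queue]
        consumers[next(self._rr[queue]) % len(consumers)](message)

    def publish(self, topic, message):
        # Pub/Sub model: every subscriber receives every message.
        for subscriber in self.topics[topic]:
            subscriber(message)
```

Kafka and Pulsar each blend both models: consumer groups (Kafka) and subscription types (Pulsar) let a topic behave as a queue within a group and as pub/sub across groups.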
Module 4: Data schema management
- The notion of “Event-driven Contract”
- The event formats (Avro, Protobuf)
- Strategies to evolve schemas
- Modeling an event: best practices
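The core idea behind schema evolution strategies can be shown without a registry: a new optional field carries a default, so events written with the old schema remain readable (backward compatibility). The dict-based schema below is a simplified illustration loosely inspired by Avro's default values, not Avro's actual API.

```python
# Sketch of backward-compatible schema evolution: a field added with a
# default keeps events written under the old schema readable.
NEW_SCHEMA = {
    "name": "OrderCreated",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "total",    "type": "float"},
        # Added in v2 of the contract; the default covers old events.
        {"name": "currency", "type": "string", "default": "EUR"},
    ],
}


def read_event(raw, schema):
    """Decode a raw event, filling missing fields from schema defaults."""
    record = {}
    for f in schema["fields"]:
        if f["name"] in raw:
            record[f["name"]] = raw[f["name"]]
        elif "default" in f:
            record[f["name"]] = f["default"]
        else:
            raise ValueError(f"missing required field {f['name']}")
    return record


old_event = {"order_id": "42", "total": 99.0}   # written with schema v1
decoded = read_event(old_event, NEW_SCHEMA)
```

Removing a field or adding one without a default would break this compatibility, which is why evolution rules belong in the "event-driven contract" between producer and consumers.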
Module 5: Integration patterns
- Building a migration strategy: Unidirectional Event-driven Architecture
- Data sourcing or Change Data Capture
- The “Query-based” pattern
- The “Log-based” pattern
- Limitations: Data consistency and schema dependencies
- Implementation of the Anti-Corruption Layer pattern
- Implementation of the Outbox Tables pattern
- Consuming the data
- Introduction to Debezium
- The Connect frameworks (Kafka Connect and Pulsar IO)
- Impact on the organization: Dependencies and team responsibilities
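The Outbox Tables pattern mentioned above can be sketched with SQLite: the business change and the outgoing event are written in the same local transaction, and a separate relay (here a plain function; in practice a tool such as Debezium or a Connect task) publishes pending events. Table and function names are illustrative assumptions.

```python
# Minimal sketch of the Outbox Tables pattern using an in-memory SQLite DB.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id TEXT PRIMARY KEY, total REAL);
    CREATE TABLE outbox (id INTEGER PRIMARY KEY AUTOINCREMENT,
                         type TEXT, payload TEXT, published INTEGER DEFAULT 0);
""")


def create_order(order_id, total):
    # One transaction covers both writes, so the event can never be lost
    # or emitted for a change that was rolled back.
    with conn:
        conn.execute("INSERT INTO orders VALUES (?, ?)", (order_id, total))
        conn.execute(
            "INSERT INTO outbox (type, payload) VALUES (?, ?)",
            ("OrderCreated", json.dumps({"id": order_id, "total": total})),
        )


def relay(publish):
    # Polls unpublished events and hands them to the broker callback.
    rows = conn.execute(
        "SELECT id, type, payload FROM outbox WHERE published = 0"
    ).fetchall()
    with conn:
        for row_id, type_, payload in rows:
            publish(type_, json.loads(payload))
            conn.execute("UPDATE outbox SET published = 1 WHERE id = ?", (row_id,))
```

Unlike the query- and log-based CDC patterns, the outbox gives the producing team full control over the published event shape, avoiding the schema dependency on internal tables.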
Module 6: Event-driven, Streaming services
- Partitioned Event Streams
- Consumer Groups
- Assignment strategies and co-location of data
- Co-location of processing: Shuffling vs Re-partitioning
- Frameworks: Flink & Kafka Streams
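How partitioned streams co-locate data and processing can be sketched in a few lines: events with the same key always hash to the same partition, and partitions are divided among the consumers of a group. The assignment below is a simplified round-robin illustration; real clients such as Kafka's offer several configurable strategies.

```python
# Sketch of key-based partitioning and a simple partition assignment.
import zlib

NUM_PARTITIONS = 6


def partition_for(key):
    # Deterministic hash: all events for one key land on one partition,
    # which preserves per-key ordering and co-locates related data.
    return zlib.crc32(key.encode()) % NUM_PARTITIONS


def assign(partitions, consumers):
    # Spread partitions over group members as evenly as possible.
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(sorted(partitions)):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment
```

When a stateful operation (a join or aggregation) needs a different key than the one the stream is partitioned by, the data must first be re-partitioned (shuffled), which is the trade-off Flink and Kafka Streams both manage for you.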
Module 7: Event-driven, Microservices
- Communication patterns
- Interaction mechanisms
- Event Notification
- Event Carried State Transfer
- Event Chain
- Synchronous commands
- The Event Sourcing model
- What is Event Sourcing?
- Event Store: Concepts and characteristics
- Command Query Responsibility Segregation (CQRS)
- The challenges
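The Event Sourcing model in this module boils down to one idea that fits in a short sketch: state is never stored, only derived by replaying an append-only log of events. This illustrative example uses a bank-account domain as an assumption; it is not tied to any event store product.

```python
# Compact sketch of Event Sourcing: commands append immutable events,
# and current state is always rebuilt by replaying the log.
events = []   # the event store: an append-only log


def deposit(account, amount):
    events.append(("Deposited", account, amount))


def withdraw(account, amount):
    # The command validates against replayed state before appending.
    if balance(account) < amount:
        raise ValueError("insufficient funds")
    events.append(("Withdrawn", account, amount))


def balance(account):
    # State is derived, never stored: fold over the event stream.
    total = 0
    for type_, acc, amount in events:
        if acc == account:
            total += amount if type_ == "Deposited" else -amount
    return total
```

In a CQRS arrangement, `balance` would be replaced on the query side by a continuously updated projection, so reads no longer replay the full log.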
Module 8: Implementing Event-driven workflows
- Patterns for building processing pipelines
- Distributed transactions
- Compensation mechanisms
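Because a workflow spanning several services cannot use a single ACID transaction, the compensation mechanisms above replace rollback with explicit undo actions, in the style of a saga. The sketch below is a hedged illustration; step names and the simulated failure are assumptions.

```python
# Sketch of a saga-style compensation mechanism: each step has an undo
# action, and when a step fails the completed steps are compensated in
# reverse order.
def run_saga(steps):
    done = []
    try:
        for action, compensate in steps:
            action()
            done.append(compensate)
    except Exception:
        for compensate in reversed(done):
            compensate()
        return False
    return True


log = []
ok = run_saga([
    (lambda: log.append("reserve stock"), lambda: log.append("release stock")),
    (lambda: 1 / 0,                       lambda: None),  # payment step fails
])
```

Note that compensation is a new business action, not a rollback: the intermediate state was visible to other services, which is one of the challenges of distributed transactions.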
Module 9: Event reprocessing and deterministic behavior
- The motivations for data reprocessing
- Idempotence and Processing Semantics
- The notion of time in an event stream
- How to manage retroactive events?
- The challenges
- How to manage interactions with external systems?
- Understanding the impacts of Query and Command
- Temporal Properties
- The Gateways
- Evolution of the source code
- Understanding the impacts
- Implementing the Agreement Dispatcher pattern
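A prerequisite for safe reprocessing is the idempotence discussed in this module: redelivered or replayed events must have no additional effect. A common sketch (an assumption here, not a library API) is to record processed event ids, turning at-least-once delivery into effectively-once processing.

```python
# Sketch of an idempotent consumer: duplicates are detected by event id
# and skipped, so reprocessing a stream yields a deterministic result.
class IdempotentConsumer:
    def __init__(self, handler):
        self.handler = handler
        self.seen = set()   # in practice, persisted with the state

    def consume(self, event_id, payload):
        if event_id in self.seen:
            return False            # duplicate: no side effect
        self.handler(payload)
        self.seen.add(event_id)
        return True
```

Interactions with external systems remain the hard case: a replayed event must not trigger the external call again, which is why gateways that suppress outbound effects during reprocessing are covered here.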
Module 10: Materializing event streams
- The different patterns (External, Internal, Global, Shared)
Module 11: Streaming SQL Databases
- What is a streaming database?
- The different solutions
- Confluent ksqlDB
Module 12: Modeling an Event System
- The different approaches
- The Event Modeling approach
Module 13: Standards and Specifications
- Traceability: OpenTracing
- CloudEvents (CNCF)
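To make the CloudEvents specification concrete: it standardizes the envelope around an event, with `specversion`, `id`, `source`, and `type` as the required context attributes and the domain payload carried in `data`. The helper below is an illustrative sketch that builds such an envelope by hand, not the official CloudEvents SDK.

```python
# Sketch of an event wrapped in a CloudEvents 1.0 envelope.
import uuid
from datetime import datetime, timezone


def cloud_event(type_, source, data):
    return {
        # Required context attributes per the CloudEvents spec.
        "specversion": "1.0",
        "id": str(uuid.uuid4()),
        "source": source,
        "type": type_,
        # Optional attributes.
        "time": datetime.now(timezone.utc).isoformat(),
        "datacontenttype": "application/json",
        # The domain payload itself.
        "data": data,
    }


event = cloud_event("com.example.order.created", "/orders", {"order_id": "42"})
```

A shared envelope like this lets brokers, gateways, and tracing tools route and correlate events without understanding each payload.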