Chapters:
1. Intro
2. The Flink Job
3. The Entry Point
4. The Stream Execution Environment
5. Kafka Configuration
6. The Data Generator Source
7. The Data Stream
8. The Kafka Record Serialization Schema
9. The Kafka Sink
10. Putting it Together
11. Executing the Stream
12. Compiling and Running
13. Verifying it Works
14. Next Steps
Description:
Learn how to produce Apache Kafka messages using Apache Flink and Java in this 11-minute video tutorial. Follow along as Wade Waldron guides you through a complete example, covering essential topics such as setting up the Flink job, configuring Kafka, implementing a data generator source, creating a data stream, and utilizing the Kafka sink. Gain hands-on experience in compiling, running, and verifying your Flink application. Perfect for developers looking to enhance their stream processing skills and integrate Apache Flink with Apache Kafka.
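
For reference, below is a minimal sketch of the kind of job the video walks through, assuming Flink 1.17 or later with the flink-connector-kafka and flink-connector-datagen dependencies on the classpath. The class name (DataGeneratorJob), broker address (localhost:9092), topic name (flink-output), message format, generation rate, and delivery guarantee are illustrative assumptions, not necessarily the values used in the video.

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.connector.source.util.ratelimit.RateLimiterStrategy;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.datagen.source.DataGeneratorSource;
import org.apache.flink.connector.datagen.source.GeneratorFunction;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DataGeneratorJob {

    public static void main(String[] args) throws Exception {
        // Entry point: obtain the stream execution environment.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Data generator source: emits simple string messages at a capped rate.
        GeneratorFunction<Long, String> generator = index -> "message-" + index;
        DataGeneratorSource<String> source = new DataGeneratorSource<>(
                generator,
                Long.MAX_VALUE,                    // effectively unbounded stream
                RateLimiterStrategy.perSecond(10), // assumed rate: 10 records/second
                Types.STRING);

        // Build a data stream from the source.
        DataStream<String> stream = env.fromSource(
                source, WatermarkStrategy.noWatermarks(), "data-generator");

        // Kafka sink: the record serialization schema maps each string to a
        // record on the target topic.
        KafkaSink<String> sink = KafkaSink.<String>builder()
                .setBootstrapServers("localhost:9092") // assumed broker address
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("flink-output")       // assumed topic name
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
                .build();

        // Put it together and execute the stream.
        stream.sinkTo(sink);
        env.execute("kafka-producer-job");
    }
}

Once the job is packaged and submitted to a Flink cluster alongside a running Kafka broker, attaching a console consumer to the chosen topic should show the generated messages arriving, which mirrors the verification step near the end of the video.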

Producing Apache Kafka Messages Using Apache Flink and Java

Confluent