
Getting Your Feet Wet with Stream Processing – Part 1: Tutorial for Developing Streaming Applications

Blog post from Confluent

Post Details
Company
Confluent
Date Published
Author
Yeva Byzek, Victoria Xia, Wade Waldron
Word Count
1,820
Language
English
Hacker News Points
-
Summary

This blog post introduces stream processing, a technology for collecting, storing, and managing continuous streams of data. Stream processing has numerous use cases and provides benefits such as decoupling dependencies between services, enabling pluggability, and allowing services to evolve independently. The post announces a new resource: a free, self-paced tutorial for developers who are just getting started with stream processing, covering the basics of the Kafka Streams API and common patterns for designing and building event-driven applications. The tutorial is built around a small microservices ecosystem showcasing an order management workflow, in which business events propagate through the ecosystem and trigger services to validate orders in parallel. The system also includes a blocking HTTP GET interface that lets clients read their own writes, along with further services for sending emails and collating orders. The tutorial's exercises help developers learn patterns for writing solid streaming applications and gain hands-on experience with the Kafka Streams API.
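To make the order-validation pattern concrete, here is a minimal, hypothetical sketch in Python. The tutorial itself uses the Kafka Streams API in Java; this sketch only illustrates the shape of the pattern — several independent validator services consume the same order event (in Kafka, separate consumers of an orders topic) and a collating step aggregates their pass/fail results. All names and validation rules below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Order:
    order_id: str
    quantity: int
    total: float

def inventory_check(order: Order) -> bool:
    # Placeholder rule: assume stock is available for quantities under 100.
    return order.quantity < 100

def fraud_check(order: Order) -> bool:
    # Placeholder rule: flag unusually large order totals.
    return order.total < 10_000

# Each validator stands in for an independent service that would consume
# the same order event in parallel.
VALIDATORS = [inventory_check, fraud_check]

def validate(order: Order) -> dict:
    # In the streaming version each service emits its own result event;
    # here we simply collect the results and collate them in one place.
    results = {fn.__name__: fn(order) for fn in VALIDATORS}
    results["validated"] = all(results.values())
    return results

print(validate(Order("o-1", quantity=2, total=49.99)))
```

In the real event-driven system the collation would itself be a streaming join over the validators' result topics, so no service needs to call another directly — which is the decoupling benefit the post describes.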