
Using Confluent Platform to Complete a Massive Cloud Provider Migration and Handle Half a Million Events Per Second

Blog post from Confluent

Post Details
Company: Confluent
Date Published: -
Author: Oguz Kayral, Gil Friedlis, Matt Mangia
Word Count: 1,070
Language: English
Hacker News Points: -
Summary

Unity's small data team manages the infrastructure that supports both its development platform and its monetization network, handling half a million events per second and millions of dollars in transactions. Since adopting Confluent Platform and Apache Kafka®, it has done so without outages. Unity initially struggled to integrate departmental data pipelines built on different technology stacks, but unified them with Kafka, most notably during a major cloud migration from AWS to GCP. With Confluent's support, Unity moved from a batch-processing model to event streaming, cutting data latency from two days to 15 minutes and enabling real-time decision-making across the business. The transformation has driven widespread adoption of event streaming systems at Unity, sparked interest in using Kafka for real-time machine learning model training, and shifted internal perceptions toward more stable and efficient data handling.