Company: New Relic
Author: Amy Boyle
Word count: 1877
Language: English
Hacker News points: None

Summary

New Relic has built a scalable Kafka pipeline to process event data from its microservices. Kafka topics move data between services, decoupling producers from consumers and letting each stage scale independently. A chain of stream-processing services operates on event data in series: each service consumes from one topic, transforms the data, and produces messages onto the next topic for the following service to consume, which enables efficient processing of fine-grained monitoring data. New Relic also uses Kafka as a changelog, durably caching state so that services can store and reload it; log compaction retains the latest record per key, and consumers de-duplicate records manually where needed, keeping the system high-throughput and real-time. To achieve concurrency, the team uses the disruptor pattern, which efficiently parallelizes processing of data from multiple partitions. As the business scales, New Relic continues to refine its Kafka strategies to increase scalability, reduce dependencies, and improve code maintainability.
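The chain-of-services idea can be pictured with a minimal sketch. This is not New Relic's code: in-memory queues stand in for Kafka topics, and the service names and event fields are illustrative assumptions. Each stage consumes from one "topic" and produces onto the next, in series.

```python
from queue import Queue

# In-memory queues stand in for Kafka topics (illustrative only).
raw_events = Queue()
enriched_events = Queue()
aggregated_events = Queue()

def enrich_service(in_topic, out_topic):
    """Consume raw events, tag each one, produce to the next topic."""
    while not in_topic.empty():
        event = in_topic.get()
        event["source"] = "microservice"   # hypothetical enrichment step
        out_topic.put(event)

def aggregate_service(in_topic, out_topic):
    """Consume enriched events and emit per-name counts downstream."""
    counts = {}
    while not in_topic.empty():
        event = in_topic.get()
        counts[event["name"]] = counts.get(event["name"], 0) + 1
    for name, count in counts.items():
        out_topic.put({"name": name, "count": count})

# Producers write raw events; each service then runs in sequence.
for name in ["cpu", "cpu", "mem"]:
    raw_events.put({"name": name})
enrich_service(raw_events, enriched_events)
aggregate_service(enriched_events, aggregated_events)

results = {}
while not aggregated_events.empty():
    record = aggregated_events.get()
    results[record["name"]] = record["count"]
print(results)  # {'cpu': 2, 'mem': 1}
```

Because each stage only knows its input and output topics, a stage can be rewritten, rescaled, or replaced without touching its neighbors, which is the decoupling benefit the summary describes.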
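The changelog-with-compaction pattern can also be sketched under stated assumptions: a compacted topic keeps only the latest record per key, so a service restarting can rebuild its in-memory cache by replaying the topic. The record shapes and keys below are hypothetical, and a plain list stands in for the Kafka topic.

```python
# A changelog modeled as an ordered list of (key, value) records.
# Compaction keeps only the most recent record per key (illustrative,
# not Kafka's actual API).
changelog = [
    ("app-1", {"state": "starting"}),
    ("app-2", {"state": "running"}),
    ("app-1", {"state": "running"}),   # supersedes the first app-1 record
]

def compact(records):
    """Keep only the latest value for each key, as log compaction does."""
    latest = {}
    for key, value in records:
        latest[key] = value
    return latest

def rebuild_cache(records):
    """Replay the changelog on startup to restore durable cached state."""
    return compact(records)

cache = rebuild_cache(changelog)
print(cache)  # {'app-1': {'state': 'running'}, 'app-2': {'state': 'running'}}
```

Replaying a compacted topic is bounded by the number of distinct keys rather than the total number of records ever written, which is what makes this viable as a durable cache.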