
Getting Started With Refinery: Rules File Template

Blog post from Honeycomb

Post Details
Company: Honeycomb
Date Published: -
Author: Max Aguirre
Word Count: 1,200
Language: English
Hacker News Points: -
Summary

Sampling is essential for managing large-scale applications, and Honeycomb recommends using its Refinery tool to sample data effectively. The rules file provided is an example template for setting up Refinery, consisting of seven rules designed to minimize event volume by dropping uninteresting data while retaining rare or significant data such as errors and anomalies. The rules employ three types of samplers: the Rules-Based Sampler, the EMA Dynamic Sampler, and the EMA Throughput Sampler, which assess traces in sequence to determine the appropriate sampling action. The strategy begins with specific rules for keeping and dropping data, proceeds with dynamic sampling for frequently occurring events, and concludes with a catch-all rule to handle any remaining traces.

The philosophy underpinning these rules is to discard routine data, such as fast, successful requests and health checks, while preserving data that may indicate issues or require attention. Users are encouraged to regularly review and adjust their sampling rules to accommodate new services and changing data patterns, ensuring optimal sampling performance as their infrastructure evolves.
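To make the structure concrete, here is a minimal sketch of what a Refinery rules file following this strategy might look like. This is not the seven-rule template from the post itself; the field names (`http.status_code`, `http.route`), thresholds, and sample rates are illustrative assumptions, and the exact schema should be checked against the Refinery documentation for your version.

```yaml
# Illustrative sketch of a Refinery (v2-style) rules file, not the post's exact template.
RulesVersion: 2
Samplers:
  __default__:
    RulesBasedSampler:
      Rules:
        # Keep rare, significant data: every trace containing a server error.
        - Name: keep server errors
          SampleRate: 1
          Conditions:
            - Field: http.status_code
              Operator: ">="
              Value: 500
              Datatype: int
        # Drop routine, uninteresting data: health-check traffic.
        - Name: drop health checks
          Drop: true
          Conditions:
            - Field: http.route
              Operator: "="
              Value: /healthz
        # Catch-all: dynamically sample everything else, keeping rare
        # field combinations more often than common ones.
        - Name: dynamically sample the rest
          Sampler:
            EMADynamicSampler:
              GoalSampleRate: 10
              FieldList:
                - http.status_code
```

The ordering mirrors the post's strategy: explicit keep/drop rules run first, and the final rule acts as the catch-all so that no trace falls through without a sampling decision.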