To build a high-performance data processing system, start by accounting for the size and flow of data: identify where computational tasks bottleneck and focus optimization effort there. Characterizing the system along key dimensions such as working set size, average transaction size, request size, update rate, consistency requirements, locality, computation per request, and latency helps pinpoint the dominant operations responsible for data congestion. By understanding these characteristics, and by creatively varying the constraints to see how the system responds, developers can identify the fundamental bottlenecks and devise strategies to address them, ultimately arriving at a low-latency system that scales over time.
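As a rough sketch of this characterization step, the workload dimensions above can be captured in a small profile and run through simple back-of-the-envelope heuristics to suggest which constraint likely dominates. All names, fields, and thresholds here are illustrative assumptions, not a prescribed methodology:

```python
from dataclasses import dataclass

@dataclass
class WorkloadProfile:
    """Illustrative workload characteristics (hypothetical fields)."""
    working_set_bytes: int      # hot data the system touches regularly
    avg_request_bytes: int      # size of a typical request payload
    updates_per_sec: float      # write/update rate
    reads_per_sec: float        # read rate
    strong_consistency: bool    # is cross-replica coordination required?
    cpu_us_per_request: float   # computation per request, microseconds

def dominant_bottleneck(p: WorkloadProfile, ram_bytes: int,
                        cores: int = 8) -> str:
    """Rough heuristic: report the constraint likely to dominate.
    Thresholds are placeholders; real systems need measurement."""
    total_rps = p.reads_per_sec + p.updates_per_sec
    # If the hot set does not fit in memory, most requests hit storage.
    if p.working_set_bytes > ram_bytes:
        return "storage: working set exceeds memory"
    # Coordinated writes tend to cap throughput before CPU does.
    if p.strong_consistency and p.updates_per_sec > 0.2 * total_rps:
        return "coordination: consistent, update-heavy workload"
    # Compare demanded CPU-seconds per second against available cores.
    if p.cpu_us_per_request * total_rps / 1e6 > cores:
        return "compute: CPU demand exceeds available cores"
    return "no single dominant constraint: profile under real load"

profile = WorkloadProfile(
    working_set_bytes=64 * 2**30,   # 64 GiB hot set
    avg_request_bytes=2_000,
    updates_per_sec=5_000,
    reads_per_sec=50_000,
    strong_consistency=True,
    cpu_us_per_request=200,
)
print(dominant_bottleneck(profile, ram_bytes=32 * 2**30))
```

The value of a sketch like this is less the verdict it prints than the exercise of varying one constraint at a time (halve the working set, relax consistency, batch the updates) and observing which change moves the predicted bottleneck.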