5 ms Global Redis Latency with Edge Caching
Blog post from Upstash
Edge caching reduces latency in globally distributed systems by caching REST responses at edge locations worldwide, much like a CDN, bringing average global read latency down to about 5 ms.

It is especially useful in a few scenarios. Web and mobile applications can query the database directly over REST, without a backend, and still get fast responses from anywhere. In multi-region serverless architectures, functions such as AWS Lambda can run in several regions and read from the nearest edge location, lowering latency further. Edge runtimes, including Cloudflare Workers and other edge functions, also pair well with Redis behind an edge cache thanks to its low latency and lightweight protocol.

Edge caching is available only for GET requests and incurs an additional cost. Cache response durations can be controlled with headers, and the feature complements global database replication by providing fast data access while retaining consistency.
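As a rough sketch of how a client might issue an edge-cacheable read, the snippet below builds a REST GET request for a Redis key. The endpoint URL and token are placeholders, and the `/get/<key>` path form is an assumption about the REST command layout; substitute the values and path your database actually uses. Only GET requests can be served from the edge cache, which is why the read goes over GET rather than a POST body.

```typescript
// Hypothetical endpoint and token -- replace with your own database's values.
const REST_URL = "https://global-example.upstash.io";
const REST_TOKEN = "EXAMPLE_TOKEN";

// Build an edge-cacheable GET request for a Redis key.
// Edge caching only applies to GET requests, so the read is
// expressed as a URL path (assumed form: <REST_URL>/get/<key>)
// instead of a POST body.
function buildEdgeGet(key: string): { url: string; headers: Record<string, string> } {
  return {
    url: `${REST_URL}/get/${encodeURIComponent(key)}`,
    headers: { Authorization: `Bearer ${REST_TOKEN}` },
  };
}

// Usage (requires network access to a real database):
// const { url, headers } = buildEdgeGet("page:home");
// const res = await fetch(url, { headers });
// const { result } = await res.json();
```

Keeping the read as a plain GET with a bearer header is what lets intermediate edge locations cache the response at all; a cache-control header on the response can then bound how long each edge location serves the cached value.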