Company
Date Published
Author
Isaac Harris
Word count
1516
Language
English
Hacker News points
None

Summary

AWS Lambda has evolved significantly since its introduction in 2014, growing from simple cloud functions into a full serverless platform. Even so, its usefulness for data-intensive or latency-sensitive applications was constrained by a 6 MiB response payload limit and by the requirement that responses be fully buffered before delivery. To address these limitations, AWS introduced Lambda Response Streaming, which sends partial responses to clients as they become ready, reducing time to first byte (TTFB) and raising the maximum response size to a soft limit of 20 MiB, with streaming throughput of 2 MiB/s. Streaming functions use a new handler signature that writes data to the client as it is produced; the feature is available on the Node.js 14.x, 16.x, and 18.x managed runtimes and comes with an updated billing model based on the bytes processed and streamed. Streamed responses are not compatible with API Gateway or Application Load Balancers, but they can be consumed through Lambda Function URLs or the InvokeWithResponseStream API. Tools such as Pulumi can package and deploy Lambda functions with streaming enabled, making the feature practical for modern microservices architectures.
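For illustration, a minimal streaming handler might look like the sketch below. It assumes the Node.js 18.x managed runtime, where `awslambda.streamifyResponse` is a global wrapper provided by the runtime (no import needed); the event shape and chunk contents are placeholders.

```javascript
// index.js -- minimal sketch of a response-streaming handler (Node.js 18.x runtime assumed).
// `awslambda` is a global injected by the managed runtime; nothing needs to be imported.
exports.handler = awslambda.streamifyResponse(
  async (event, responseStream, _context) => {
    responseStream.setContentType("text/plain");

    // Each write is flushed toward the client as it happens,
    // instead of being buffered until the function returns.
    for (let i = 0; i < 5; i++) {
      responseStream.write(`chunk ${i}\n`);
      await new Promise((resolve) => setTimeout(resolve, 100));
    }

    responseStream.end();
  }
);
```

Because API Gateway and Application Load Balancers buffer responses, a function like this has to be exposed through a Function URL configured for streaming (or invoked directly with InvokeWithResponseStream). A rough Pulumi sketch follows, assuming the @pulumi/aws provider's FunctionUrl resource with `invokeMode` set to RESPONSE_STREAM and a hypothetical `./app` directory containing the handler above:

```javascript
"use strict";
const pulumi = require("@pulumi/pulumi");
const aws = require("@pulumi/aws");

// Execution role the function assumes at runtime.
const role = new aws.iam.Role("streaming-role", {
  assumeRolePolicy: JSON.stringify({
    Version: "2012-10-17",
    Statement: [{
      Action: "sts:AssumeRole",
      Effect: "Allow",
      Principal: { Service: "lambda.amazonaws.com" },
    }],
  }),
});
new aws.iam.RolePolicyAttachment("streaming-role-logs", {
  role: role.name,
  policyArn: "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
});

// Package and deploy the handler code from ./app (hypothetical path).
const fn = new aws.lambda.Function("streaming-fn", {
  runtime: "nodejs18.x",
  handler: "index.handler",
  role: role.arn,
  code: new pulumi.asset.FileArchive("./app"),
});

// Function URL configured for response streaming rather than buffering.
const url = new aws.lambda.FunctionUrl("streaming-url", {
  functionName: fn.name,
  authorizationType: "NONE",
  invokeMode: "RESPONSE_STREAM",
});

exports.endpoint = url.functionUrl;
```

This is a sketch under the stated assumptions rather than the article's exact program; the key detail is the `invokeMode: "RESPONSE_STREAM"` setting on the Function URL, without which responses would still be buffered.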