AWS Lambda and Google Cloud Functions currently lead on raw performance, with AWS Lambda consistently outperforming its competitors across all languages. Cloudflare Workers, however, achieves lower cold start latency than traditional FaaS providers because it executes functions directly in the V8 engine, which keeps startup overhead comparable to running JavaScript on an already-running Node.js process. EdgeEngine and Fly.io likewise run functions close to the caller to reduce latency, but the performance differences among these edge providers are significant: Cloudflare is several times faster than its competitors. Azure Functions and IBM Cloud Functions exhibit high worst-case latencies, while Vercel and Netlify build on AWS Lambda and therefore inherit its performance characteristics. Do-it-yourself FaaS frameworks struggle to match true serverless providers on cold start latency, scaling, and distribution.

The four main FaaS providers (AWS, Azure, Google, IBM) are comparable in pricing, while the new kids on the block, such as Cloudflare Workers and EdgeEngine, offer distinctive features and pricing models. Language support covers the languages most engineers prefer, including JavaScript, Go, Python, Ruby, Java, C#, PowerShell, PHP, Swift, Bash, and Rust, with some providers also supporting WebAssembly. Memory limits range from 128 MB to 3,008 MB, execution time limits range from two minutes to, on some platforms, no limit even for CPU-bound tasks, and maximum request and response payload sizes also differ significantly between providers.

Overall, the choice of FaaS provider comes down to performance requirements, ease of use, pricing, language support, and the specific use case, whether that is edge computing or a do-it-yourself deployment.
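To make the execution-model difference behind the cold start numbers concrete, here is a minimal sketch contrasting the two programming models: a Cloudflare Worker served by a V8 isolate versus an AWS Lambda handler running in a provisioned Node.js runtime. The handler shapes follow the providers' documented module formats; the greeting logic and parameter names are purely illustrative and not taken from any benchmark discussed above.

```ts
// Cloudflare Workers (ES module syntax): the fetch handler runs inside a V8
// isolate on an edge node that is already running, so a cold start only has
// to create an isolate, not boot a container or a Node.js process.
export default {
  async fetch(request: Request): Promise<Response> {
    const name = new URL(request.url).searchParams.get("name") ?? "world";
    return new Response(`Hello, ${name}!`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```

```ts
// AWS Lambda (Node.js runtime, ES module handler): functionally equivalent,
// but on a cold start the platform must first provision the execution
// environment and start Node.js before this handler can run.
export const handler = async (event: { name?: string }) => ({
  statusCode: 200,
  headers: { "content-type": "text/plain" },
  body: `Hello, ${event.name ?? "world"}!`,
});
```

The code is nearly identical; what differs is the work the platform does before it can run, which is spinning up an isolate on the Workers side versus provisioning a full runtime environment on the traditional FaaS side, and that gap is where the cold start differences come from.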