This latest development allows developers to access Llama 2 on the Cloudflare Workers platform, enabling them to build LLM-augmented experiences directly into their applications with reduced latency and improved personalization. To deploy AI in an application, developers must tackle three new challenges: operating closer to real time, providing customer and domain context, and "closing the loop" to understand how AI experiences affect the overall customer journey. By combining Segment's Edge SDK with Cloudflare Workers and Llama 2, developers can build high-context, low-latency LLM app experiences with a minimal code footprint. This enables personalized experiences, such as customized welcome messages, live updates while customers are in-store, and targeted promotions, ultimately leading to improved customer engagement at scale.
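As a rough sketch of the pattern, the Worker below builds a personalized prompt from customer traits and sends it to Llama 2 via Cloudflare's Workers AI binding (`env.AI.run` with the `@cf/meta/llama-2-7b-chat-int8` model). The profile shape, trait names, and hard-coded traits are illustrative assumptions; in practice the traits would come from Segment's Edge SDK running alongside the Worker.

```typescript
// Illustrative sketch: a Worker that generates a personalized welcome
// message with Llama 2. Trait names and values are assumptions standing
// in for a profile resolved by Segment's Edge SDK.

interface ProfileTraits {
  firstName?: string;
  favoriteCategory?: string;
}

// Pure helper: turn customer traits into an LLM prompt.
export function buildWelcomePrompt(traits: ProfileTraits): string {
  const name = traits.firstName ?? "there";
  const interest = traits.favoriteCategory
    ? ` They have shown interest in ${traits.favoriteCategory}.`
    : "";
  return `Write a one-sentence welcome message for ${name}.${interest}`;
}

export default {
  async fetch(
    request: Request,
    env: { AI: { run: (model: string, input: unknown) => Promise<unknown> } }
  ): Promise<Response> {
    // In a real deployment, look up traits for this visitor via the
    // Segment Edge SDK; hard-coded here for the sketch.
    const traits: ProfileTraits = {
      firstName: "Ada",
      favoriteCategory: "running shoes",
    };
    const result = await env.AI.run("@cf/meta/llama-2-7b-chat-int8", {
      messages: [{ role: "user", content: buildWelcomePrompt(traits) }],
    });
    return Response.json(result);
  },
};
```

Because the model runs on Cloudflare's edge network and the prompt is assembled in the same Worker, no extra round trip to an origin server is needed to add customer context.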