Datadog's Cloud Cost Management (CCM) and LLM Observability work together to provide granular insight into OpenAI token usage and cost, helping organizations track the total cost of ownership of their generative AI services. CCM breaks down real spend from the project or organization level down to individual models and token consumption, while LLM Observability provides a consolidated view of operational performance, model quality and safety, and application traces. The OpenAI integration offers three ways to monitor cost: out-of-the-box metrics from the OpenAI API integration, the native Cloud Cost Management integration, and the native LLM Observability integration. Because CCM stores pricing information for OpenAI models, it can report accurate, up-to-date spend figures, while LLM Observability lets teams investigate the root causes of issues, monitor operational performance, and evaluate the quality, privacy, and safety of their LLM applications.
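As a minimal sketch of the LLM Observability side, the snippet below assumes the Python `ddtrace` library (which includes the LLM Observability SDK and auto-instrumentation for the `openai` package) and the official OpenAI client; the `ml_app` name, the agentless mode, the model name, and the environment variables (`DD_API_KEY`, `DD_SITE`, `OPENAI_API_KEY`) are illustrative assumptions, not prescribed configuration.

```python
# Sketch: tracing an OpenAI call with Datadog's LLM Observability SDK.
# Assumes ddtrace and openai are installed and that DD_API_KEY, DD_SITE,
# and OPENAI_API_KEY are set in the environment.
from ddtrace.llmobs import LLMObs
from openai import OpenAI

# Enable LLM Observability; ml_app groups traces for this application.
# agentless_enabled sends data directly to Datadog without a local Agent.
LLMObs.enable(ml_app="support-chatbot", agentless_enabled=True)

# Once enabled, ddtrace instruments the openai client, capturing latency,
# token counts, and errors for each request.
client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": "Summarize our open support tickets."}],
)

print(response.choices[0].message.content)
print(response.usage)  # prompt/completion token counts that feed usage and cost metrics
```

Token counts captured per request are what make it possible to attribute spend to specific models and applications, which is the data CCM and LLM Observability surface at the organization, project, and model level.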