LangSmith now supports ingesting traces in the OpenTelemetry format, an open standard for distributed tracing and observability that lets developers instrument and export telemetry data across many programming languages and tools. LangSmith's API layer accepts OpenTelemetry traces directly, so LLM calls and the surrounding system telemetry can be monitored together in one place.

OpenTelemetry defines semantic conventions, standardized attribute names and values for different use cases, and LangSmith specifically supports the conventions for generative AI. The platform also accepts traces in the OpenLLMetry format, which offers out-of-the-box instrumentation for many LLM providers and frameworks; support for other semantic conventions will be added as they develop. To ingest traces, users point any OpenTelemetry-compatible SDK at LangSmith's OTEL endpoint.

Integration examples are provided for the OpenTelemetry Python client, the OpenLLMetry SDK from Traceloop, and the Vercel AI SDK, each showing how to send traces and view them in the LangSmith dashboard.
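As a rough sketch of the manual OpenTelemetry Python setup, the snippet below configures an OTLP/HTTP exporter pointed at LangSmith's OTEL endpoint and emits a single span annotated with GenAI semantic-convention attributes. The endpoint URL, header names, and project name shown here are assumptions based on the LangSmith documentation; confirm the exact values in the current docs before use.

```python
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Export spans over OTLP/HTTP to LangSmith. The endpoint path and header
# names below are illustrative; check the LangSmith OTEL docs for the
# exact values expected by your instance.
exporter = OTLPSpanExporter(
    endpoint="https://api.smith.langchain.com/otel/v1/traces",
    headers={
        "x-api-key": os.environ["LANGSMITH_API_KEY"],
        "Langsmith-Project": "my-otel-project",  # hypothetical project name
    },
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-app")

# Record an LLM call as a span using GenAI semantic-convention attributes,
# which is what allows the trace to be interpreted as an LLM run rather
# than a generic span.
with tracer.start_as_current_span("chat gpt-4o-mini") as span:
    span.set_attribute("gen_ai.system", "openai")
    span.set_attribute("gen_ai.request.model", "gpt-4o-mini")
    span.set_attribute("gen_ai.prompt.0.role", "user")
    span.set_attribute("gen_ai.prompt.0.content", "Say hello")
    # ... call the model here, then record its response ...
    span.set_attribute("gen_ai.completion.0.role", "assistant")
    span.set_attribute("gen_ai.completion.0.content", "Hello!")
    span.set_attribute("gen_ai.usage.input_tokens", 5)
    span.set_attribute("gen_ai.usage.output_tokens", 2)
```

With OpenLLMetry or the Vercel AI SDK the same idea applies: configure the OTLP endpoint and headers (typically via the standard `OTEL_EXPORTER_OTLP_ENDPOINT` and `OTEL_EXPORTER_OTLP_HEADERS` environment variables), and the SDK's instrumentation emits the spans for you.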