New in Arize: Realtime Trace Ingestion, Prompt Playground Upgrades & More
Blog post from Arize
In May, Arize shipped several significant updates, headlined by the expansion of realtime trace ingestion to all Arize AX tiers. Teams can now monitor large language model (LLM) performance live without any configuration changes. This capability, previously limited to enterprise users and the open-source Phoenix platform, is now available to every user, including those on the free tier.

The release also brings major usability upgrades to the prompt playground and span views: improved latency tracking, visible token counts, and a more streamlined interface for debugging. Support for additional OpenAI models in the prompt playground and tasks, together with attribute-level filtering, speeds up experimentation and trace analysis. Collectively, these improvements streamline workflows with a cleaner display of inputs and outputs, enabling faster debugging and easier management of prompt variables and dataset values.
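To make the idea of attribute-level filtering concrete, here is a minimal sketch in plain Python. The span dictionaries, the `filter_spans`/`summarize` helpers, and the attribute keys (loosely modeled on OpenInference-style naming) are illustrative assumptions, not Arize's actual API or data model:

```python
# Illustrative sketch: filtering LLM trace spans by attribute values,
# then summarizing latency and token counts for the matching spans.
# The span shape here is an assumption for demonstration purposes.

def filter_spans(spans, **attrs):
    """Return spans whose attributes match every key=value pair given."""
    return [
        s for s in spans
        if all(s.get("attributes", {}).get(k) == v for k, v in attrs.items())
    ]

def summarize(spans):
    """Aggregate span count, total tokens, and average latency."""
    total_tokens = sum(s["attributes"].get("llm.token_count.total", 0) for s in spans)
    avg_latency = sum(s["latency_ms"] for s in spans) / len(spans) if spans else 0.0
    return {"count": len(spans),
            "total_tokens": total_tokens,
            "avg_latency_ms": avg_latency}

spans = [
    {"name": "chat", "latency_ms": 420.0,
     "attributes": {"llm.model_name": "gpt-4o", "llm.token_count.total": 812}},
    {"name": "chat", "latency_ms": 95.0,
     "attributes": {"llm.model_name": "gpt-4o-mini", "llm.token_count.total": 310}},
]

# Keep only spans from one model, then summarize them.
gpt4o_spans = filter_spans(spans, **{"llm.model_name": "gpt-4o"})
print(summarize(gpt4o_spans))
# → {'count': 1, 'total_tokens': 812, 'avg_latency_ms': 420.0}
```

In a real tracing backend the filtering happens server-side over indexed span attributes, but the shape of the operation, matching key/value pairs and aggregating over the result, is the same.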