Using LLMs to generate real-time data visualizations
Blog post from Tinybird
Tinybird, an analytics backend for software applications, is increasingly used by developers to monitor Large Language Model (LLM) usage, cost, and performance as AI features become more common. The company has open-sourced the LLM Performance Tracker app template, which includes both a frontend and a backend that capture and analyze LLM calls in real time.

A notable feature of the template is the AI Cost Calculator, which lets users visualize LLM costs: free-text user input is passed through an API route, where an LLM turns it into structured parameters for the Tinybird data API. The system is designed for scale and speed, handling millions of LLM call logs efficiently. The app is built from a few components: a Tinybird data source and pipe, React components, and the API route that generates structured parameters from user input, ultimately producing dynamic, user-generated data visualizations.

While an LLM is used to convert free-text input into structured data, tasks like choosing chart types and generating SQL queries are deliberately handled without LLMs to preserve performance, security, and observability. The template illustrates how combining LLMs with static, predefined APIs enables real-time data visualization without sacrificing data security or efficiency.
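The division of labor described here, where the LLM emits only structured parameters and the SQL lives in predefined Tinybird pipes, might be sketched as follows in TypeScript. The pipe names, parameter fields, and the helper function are illustrative assumptions rather than code from the actual template; only the `/v0/pipes/{pipe}.json` endpoint shape follows Tinybird's public API convention.

```typescript
// Hypothetical sketch: the LLM returns only structured parameters, and this
// helper validates them against a whitelist before building the Tinybird
// pipe URL. Pipe and field names below are assumptions for illustration.

type ChartParams = {
  pipe: string;        // which predefined Tinybird pipe to call
  start_date: string;  // ISO date, e.g. "2024-01-01"
  end_date: string;
  model?: string;      // optional filter, e.g. a model name
};

// Only pipes on this list can be reached, no matter what the LLM emits.
const ALLOWED_PIPES = new Set(["llm_cost_by_day", "llm_cost_by_model"]);

function buildPipeUrl(host: string, token: string, p: ChartParams): string {
  // Reject anything outside the whitelist so free-text input can never
  // steer the request toward an arbitrary endpoint or raw SQL.
  if (!ALLOWED_PIPES.has(p.pipe)) {
    throw new Error(`Unknown pipe: ${p.pipe}`);
  }
  const isoDate = /^\d{4}-\d{2}-\d{2}$/;
  if (!isoDate.test(p.start_date) || !isoDate.test(p.end_date)) {
    throw new Error("Dates must be YYYY-MM-DD");
  }
  const url = new URL(`/v0/pipes/${p.pipe}.json`, host);
  url.searchParams.set("token", token);
  url.searchParams.set("start_date", p.start_date);
  url.searchParams.set("end_date", p.end_date);
  if (p.model) url.searchParams.set("model", p.model);
  return url.toString();
}
```

Because the LLM's output is reduced to a small, validated parameter object, every request that reaches the data API is a call to a known pipe with known parameters, which is what keeps the approach observable and safe to expose to end users.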