Company
Portkey
Date Published
Author
Vrushank Vyas
Word count
355
Language
English
Hacker News points
None

Summary

Portkey shipped a substantial set of updates in May aimed at giving users more control and broader integration with large language models (LLMs). Users can now cap organization and LLM spending with budget limits and rate controls, create API keys scoped to specific permissions, and invite team members with tailored access levels. New integrations include Day 0 compatibility with the latest models from OpenAI and Google, as well as partnerships with providers such as ZhipuAI, Predibase, and MonsterAPI. Portkey also integrated with the Instructor library for structured output extraction and deepened its Promptfoo integration to run evaluations across 200+ LLMs. Other additions include simplified cache namespaces, Gemini function calling support, latency comparison tools, and the option to deploy Portkey on one's own cloud infrastructure. On the community side, Portkey's CTO took part in a Reddit AMA, and readers are invited to join the Discord community for updates.
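To illustrate the Instructor integration mentioned above, the sketch below routes an OpenAI-compatible client through Portkey's gateway and lets Instructor validate the response against a Pydantic model. This is a minimal sketch, not the post's own example: it assumes the `openai`, `instructor`, `portkey_ai`, and `pydantic` packages, and the virtual key name, model, and `User` schema are illustrative placeholders.

```python
# Minimal sketch: structured-output extraction with Instructor over Portkey's gateway.
# Assumptions: a Portkey API key and a configured virtual key for OpenAI.
import instructor
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


# Point the OpenAI client at Portkey's gateway; Portkey handles logging,
# caching, and routing around the underlying provider call.
client = instructor.from_openai(
    OpenAI(
        base_url=PORTKEY_GATEWAY_URL,
        api_key="placeholder",  # provider key is resolved via the virtual key
        default_headers=createHeaders(
            api_key="PORTKEY_API_KEY",         # assumption: your Portkey API key
            virtual_key="OPENAI_VIRTUAL_KEY",  # assumption: a configured virtual key
        ),
    )
)

# Instructor parses and validates the completion into the Pydantic model.
user = client.chat.completions.create(
    model="gpt-4o",
    response_model=User,
    messages=[{"role": "user", "content": "Jason is 25 years old."}],
)
print(user)  # -> User(name='Jason', age=25)
```

The same gateway setup is what lets Portkey layer budget limits, rate controls, and caching on top of existing OpenAI-style code without changing the calling pattern.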