Reducing MCP token usage by 100x — you don't need code mode
Blog post from Speakeasy
Chase Crumbaugh discusses recent advances in the Model Context Protocol (MCP), focusing on how Dynamic Toolsets reduce token usage. Speakeasy's refined Dynamic Toolset implementation combines progressive discovery with semantic search, cutting input token consumption by 96% and total usage by 90%, which lets agents work with hundreds of tools without overwhelming the language model's context window. Instead of registering every tool up front, the approach exposes three core tools (search_tools, describe_tools, and execute_tool) that together keep context lean while maintaining a 100% success rate across diverse tasks. Although Dynamic Toolsets require more tool calls and slightly longer execution times, they offer a scalable, cost-effective way to build AI agents that interface with large toolsets, demonstrating MCP's flexibility in meeting real-world AI system needs.
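To make the three-tool pattern concrete, here is a minimal sketch of how a dynamic toolset might work. This is an illustrative assumption, not Speakeasy's actual implementation: the tool names match the post, but the `Tool` class, the keyword-based search (a real system would use semantic/embedding search), and the example tools are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    schema: dict                      # full JSON schema, sent only on demand
    handler: Callable[..., object]

class DynamicToolset:
    """Exposes search/describe/execute meta-tools instead of registering
    every tool's full schema in the model's context up front."""

    def __init__(self, tools: list[Tool]):
        self._tools = {t.name: t for t in tools}

    def search_tools(self, query: str, limit: int = 5) -> list[str]:
        # Hypothetical keyword match; the post describes semantic search.
        q = query.lower()
        hits = [t.name for t in self._tools.values()
                if q in t.description.lower()]
        return hits[:limit]

    def describe_tools(self, names: list[str]) -> dict[str, dict]:
        # Full schemas are paid for only when the model asks for them.
        return {n: self._tools[n].schema for n in names if n in self._tools}

    def execute_tool(self, name: str, arguments: dict) -> object:
        # Dispatch a call to the named tool with the given arguments.
        return self._tools[name].handler(**arguments)

# Usage with two hypothetical tools:
toolset = DynamicToolset([
    Tool("create_invoice", "Create a new invoice for a customer",
         {"type": "object",
          "properties": {"customer_id": {"type": "string"}}},
         lambda customer_id: f"invoice for {customer_id}"),
    Tool("list_customers", "List all customers in the account",
         {"type": "object", "properties": {}},
         lambda: ["acme", "globex"]),
])

print(toolset.search_tools("invoice"))    # → ['create_invoice']
print(toolset.execute_tool("create_invoice", {"customer_id": "acme"}))
```

The extra round trips (search, then describe, then execute) are the source of the additional tool calls and slightly longer execution times the post mentions; the payoff is that only the schemas the model actually needs ever enter the context window.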