Clarifai 12.0: Introducing Pipelines for Long-Running AI Workflows
Blog post from Clarifai
Clarifai 12.0 introduces Pipelines, a new way to orchestrate long-running, multi-step AI tasks directly on the platform. Users define workflows as containerized steps that run asynchronously, with fine-grained control over execution order, parallelism, and data flow.

The release also improves model routing: a model can now be deployed across multiple nodepools, providing high availability and scalable operation without manual failover management. Agentic capabilities expand with Model Context Protocol (MCP) support, which lets models call tool servers during inference. Clarifai also adds a Pay-As-You-Go billing plan for flexible, predictable usage, along with new reasoning models from the Ministral 3 family. Finally, updates across the Python SDK and CLI improve stability and the developer experience.
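The post does not show Clarifai's actual Pipelines API, but the execution model it describes — containerized steps running asynchronously, with dependencies controlling order and independent steps running in parallel — can be sketched with a minimal asyncio scheduler. All step names and the `run_pipeline` helper below are hypothetical illustrations, not Clarifai code:

```python
import asyncio

async def run_pipeline(steps, deps):
    """Run async steps once their dependencies complete.

    steps: {name: coroutine factory}; deps: {name: [prerequisite names]}.
    Steps whose prerequisites have all finished run concurrently.
    """
    events = {name: asyncio.Event() for name in steps}
    order = []  # completion order, for inspection

    async def run(name):
        for d in deps.get(name, ()):   # block until each prerequisite signals
            await events[d].wait()
        await steps[name]()
        order.append(name)
        events[name].set()             # unblock dependent steps

    await asyncio.gather(*(run(n) for n in steps))
    return order

# Hypothetical four-step workflow: two transforms fan out from one
# extraction step and join at a final aggregation step.
async def work(delay):
    await asyncio.sleep(delay)       # stand-in for a containerized step

steps = {
    "extract":     lambda: work(0.01),
    "transform_a": lambda: work(0.02),
    "transform_b": lambda: work(0.01),
    "aggregate":   lambda: work(0.01),
}
deps = {
    "transform_a": ["extract"],
    "transform_b": ["extract"],
    "aggregate":   ["transform_a", "transform_b"],
}
order = asyncio.run(run_pipeline(steps, deps))
print(order)
```

Here `transform_a` and `transform_b` run concurrently once `extract` finishes, and `aggregate` waits for both — the same fan-out/fan-in shape a pipeline definition would express declaratively.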