Company:
Date Published:
Author: Together AI
Word count: 1448
Language: English
Hacker News points: None

Summary

Arcee AI simplifies AI adoption by building efficient small language models (SLMs) that help enterprises integrate advanced AI workflows. The company migrated its specialized SLMs from AWS to Together Dedicated Endpoints, unlocking significant gains in cost, performance, and operational agility. Arcee AI's focus on training SLMs for specific tasks has produced high-performing models, seven of which are available on Together AI serverless endpoints. Its software layer, Arcee Conductor, uses a 150-million-parameter classifier to route each query to the most suitable model, cutting both latency and cost, while Arcee Orchestra lets enterprises automate tasks through integrations with third-party services and data sources, simplifying infrastructure management. The migration to Together Dedicated Endpoints streamlined Arcee AI's infrastructure and delivered performance improvements, including lower latency and higher throughput, and ongoing GPU optimization enables effortless scaling on Together Dedicated Endpoints with superior performance, flexibility, and cost-efficiency.
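
The summary notes that Arcee Conductor routes each query to the most suitable model with a small classifier. As a rough illustration of classifier-based routing in general, not Arcee's actual implementation, the sketch below maps a prompt to a task label with a stand-in classifier and dispatches it to an OpenAI-compatible chat completions endpoint. The model IDs, the `classify` heuristic, and the `TOGETHER_API_KEY` environment variable are assumptions made for illustration.

```python
# Minimal sketch of classifier-based model routing (illustrative only, not Arcee's code).
import os
import requests

# Hypothetical label -> model table; the model IDs are placeholders.
MODEL_TABLE = {
    "code": "arcee-ai/coder-model",
    "chat": "arcee-ai/chat-model",
    "reasoning": "arcee-ai/reasoning-model",
}

def classify(prompt: str) -> str:
    """Stand-in for the small routing classifier: returns a task label for the prompt."""
    lowered = prompt.lower()
    if "def " in lowered or "bug" in lowered:
        return "code"
    if any(word in lowered for word in ("prove", "derive", "step by step")):
        return "reasoning"
    return "chat"

def route_and_call(prompt: str) -> str:
    """Pick a model for the prompt, then call an OpenAI-compatible chat completions endpoint."""
    model = MODEL_TABLE[classify(prompt)]
    resp = requests.post(
        "https://api.together.xyz/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(route_and_call("Why does this function have a bug: def add(a, b): return a - b"))
```

In a production router, the keyword heuristic would be replaced by the trained classifier's prediction, and the routing table would point at the dedicated endpoints serving each specialized model.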