Together AI has launched two products, Together API and Together Compute, to help developers build with open-source AI models cost-effectively and efficiently. These cloud services provide a full-stack solution for training, fine-tuning, and running leading open-source AI models, including the more than 50 models Together AI currently hosts. The goal is to make AI more accessible by cutting costs, which can be substantial given the large parameter counts of generative AI models. Together API offers an easy-to-use fine-tuning API along with optimized private endpoints for low-latency inference (see the sketch below), while Together Compute provides clusters of high-end GPUs paired with a distributed training stack at a competitive price point.

Together AI frames the launch as part of the "Linux moment" for AI, in which open-source solutions are gaining rapid adoption, and positions its platform as a way for developers to build and release models, datasets, and research in the open.
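To make the inference side of Together API concrete, the minimal sketch below sends a single chat completion request over HTTP. It assumes an OpenAI-compatible endpoint at api.together.xyz/v1, a hosted model ID of meta-llama/Llama-2-7b-chat-hf, and an API key stored in a TOGETHER_API_KEY environment variable; none of these specifics appear in the announcement above, so check Together AI's current documentation for the exact endpoint, model names, and request schema.

```python
import os
import requests

# Assumed environment variable holding the Together API key.
API_KEY = os.environ["TOGETHER_API_KEY"]

# Assumed OpenAI-compatible chat completions endpoint; verify against the docs.
response = requests.post(
    "https://api.together.xyz/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "meta-llama/Llama-2-7b-chat-hf",  # placeholder hosted model ID
        "messages": [
            {"role": "user", "content": "Summarize the benefits of open-source AI models in one sentence."}
        ],
        "max_tokens": 128,
    },
    timeout=60,
)
response.raise_for_status()

# Print the model's reply from the first returned choice.
print(response.json()["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI request format in this sketch, an existing OpenAI-style client could in principle be pointed at it by swapping the base URL and API key, which is one reason hosted open-model APIs tend to adopt that schema.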