The latest open source foundation models are being released at an incredible pace, and each is deployable in a couple of clicks from the Baseten Model Library. The Falcon-7B and Falcon-40B models, developed by the Technology Innovation Institute (TII), have gained popularity for their high-quality training data and strong performance, making them well suited to text generation and problem-solving applications. MusicGen is another model generating a lot of buzz, taking text or melodic inputs and producing high-quality music samples. WizardLM introduces Evol-Instruct, a method that uses an LLM to rewrite seed instructions into progressively more complex ones; fine-tuning LLaMA on the resulting instruction dataset yields the WizardLM model. MPT-7B Base, a seven-billion-parameter model trained in under 10 days, offers an affordable alternative to larger language models while still matching the quality of LLaMA-7B. These models can be deployed on Baseten in just a few clicks, making it easy for developers to integrate them into their projects.
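To make "integrate them into their projects" concrete, here is a minimal sketch of calling a deployed model over HTTP from Python. The endpoint URL shape, the `Api-Key` header format, and the `prompt` payload are assumptions for illustration; the model ID and API key are placeholders, and the exact invocation details for any given model are shown on its deployment page.

```python
import requests

# Hypothetical placeholders -- substitute your own model ID and API key.
MODEL_ID = "abc123"
API_KEY = "YOUR_API_KEY"


def build_request(model_id: str, api_key: str, payload: dict):
    """Assemble the URL, headers, and JSON body for a model invocation.

    The URL shape below is an assumption about the inference endpoint;
    consult the model's deployment page for the exact URL to use.
    """
    url = f"https://model-{model_id}.api.baseten.co/production/predict"
    headers = {"Authorization": f"Api-Key {api_key}"}
    return url, headers, payload


def invoke(model_id: str, api_key: str, payload: dict) -> dict:
    """POST the payload to the deployed model and return the JSON response."""
    url, headers, body = build_request(model_id, api_key, payload)
    resp = requests.post(url, headers=headers, json=body)
    resp.raise_for_status()
    return resp.json()
```

For a text-generation model like Falcon-7B, the payload might be as simple as `{"prompt": "Write a haiku about GPUs."}`; MusicGen and other models expect their own input schemas, so check each model's documentation for the fields it accepts.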