Cohere Labs Launches Tiny Aya, Making Multilingual AI Accessible
Blog post from Cohere
Tiny Aya is an AI model family designed to extend multilingual capabilities to underrepresented languages while running efficiently on local hardware. It delivers high-quality translation and AI education tools in regions with limited cloud infrastructure, and it maintains stable performance across diverse languages, including those with minimal web presence.

Built on extensive research from the Aya initiative, Tiny Aya uses training strategies that preserve linguistic diversity and enable efficient adaptation to new domains and languages. The family includes regionally specialized variants, TinyAya-Earth, TinyAya-Fire, and TinyAya-Water, tailored to linguistic communities in Africa, South Asia, and the Asia-Pacific respectively, while each retains broad multilingual coverage.

Accessibility is a core design goal: Tiny Aya tokenizes text efficiently across languages, reducing memory and compute requirements and making multilingual AI practical to run locally. Released as open-weight models, Tiny Aya invites researchers to adapt the models to specific linguistic ecosystems, encouraging a diverse, community-driven future for multilingual AI.
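To see why per-language tokenization efficiency matters, consider a minimal sketch. A naive byte-level tokenizer produces roughly one token per UTF-8 byte, so scripts like Devanagari (three bytes per character) yield far longer token sequences than Latin text for the same content, inflating memory and compute. The byte-level baseline below is an illustrative assumption for comparison, not Tiny Aya's actual tokenizer.

```python
def bytes_per_char(text: str) -> float:
    """Average UTF-8 bytes per character: a proxy for how much a
    byte-level tokenizer inflates sequence length for a script."""
    return len(text.encode("utf-8")) / len(text)


# ASCII text: one byte per character, so byte-level token
# sequences are about as long as the character count.
english = "hello world"

# Devanagari characters take three bytes each in UTF-8, so a
# byte-level tokenizer would emit roughly 3x more tokens for
# the same amount of text. (Example string for illustration.)
hindi = "नमस्ते दुनिया"

print(f"English: {bytes_per_char(english):.2f} bytes/char")
print(f"Hindi:   {bytes_per_char(hindi):.2f} bytes/char")
```

A tokenizer whose vocabulary covers non-Latin scripts well keeps sequence lengths, and therefore memory and compute, closer to parity across languages, which is what makes local deployment practical for the communities Tiny Aya targets.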