Since launching its Vector Toolkit a few months ago, Supabase has seen significant growth in AI applications that generate embeddings with OpenAI's text-embedding-ada-002 model and store them with pgvector. That model, however, is not open source and cannot be self-hosted. To promote open-source collaboration, Supabase has added first-class support for Hugging Face, starting with embeddings; open-source embedding models can offer better performance than OpenAI's for pgvector use cases, in part because their smaller vectors take less storage and are faster to search.

The new Python Vector Client lets developers create adapters that transform input when upserting and querying, such as splitting large text into smaller chunks or converting it into embeddings. For example, the ParagraphChunker and TextEmbedding adapter steps can be chained to create an embedding from user input and store it in the database (a sketch follows below).

Supabase Edge Functions (Deno/JavaScript) now also support running inference workloads, so developers can build API routes that accept user content, convert it to an embedding, store it in the database, and return the stored row as JSON. In addition, Database Webhooks can trigger an Edge Function whenever a row is inserted, letting developers upload plain text and have a background job convert it to an embedding. Both patterns are sketched below.

Because the models also run in the browser, features such as searching images with natural-language queries become possible on the client (see the sketch below). Choosing a Hugging Face model involves weighing dimensionality, storage requirements, and retrieval speed: text-embedding-ada-002 returns 1536-dimensional vectors while a small open-source model such as gte-small returns 384, which at 4 bytes per float is roughly 6 KB versus 1.5 KB per stored vector, with correspondingly faster distance calculations. Supabase plans to address the limitations of this initial release by reducing cold starts, handling heavier workloads, and supporting audio and image models in the future.
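A minimal sketch of the adapter pipeline with the Python Vector Client (vecs), assuming its built-in ParagraphChunker and TextEmbedding adapter steps; the connection string, collection name, metadata, and model name here are placeholders, not prescribed values:

```python
import vecs
from vecs.adapter import Adapter, ParagraphChunker, TextEmbedding

# Placeholder connection string; requires `pip install "vecs[text_embedding]"`
vx = vecs.create_client("postgresql://<user>:<password>@<host>:<port>/<db_name>")

# The adapter runs on every upsert/query: text is split into paragraphs,
# then each chunk is embedded by a Hugging Face sentence-transformers model
# (the model name is an assumption; any supported model can be substituted).
docs = vx.get_or_create_collection(
    name="docs",
    adapter=Adapter(
        [
            ParagraphChunker(skip_during_query=True),
            TextEmbedding(model="all-MiniLM-L6-v2"),
        ]
    ),
)

# Upsert plain text; the adapter chunks and embeds it before storage
docs.upsert(
    records=[
        ("doc-1", "Supabase now supports Hugging Face.\n\nEmbeddings can be generated with open source models.", {"topic": "launch"}),
    ]
)

# Query with plain text; the same TextEmbedding step embeds the query
results = docs.query(data="open source embeddings", limit=3)
print(results)
```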
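A sketch of the Edge Function API route described above, assuming Transformers.js (@xenova/transformers) for inference and supabase-js for storage; the `documents` table, its columns, and the embedding model are assumptions for illustration:

```ts
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts'
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2'
import { env, pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.5.0'

// Transformers.js configuration for the Deno runtime: no browser cache,
// always fetch model weights from the Hugging Face Hub
env.useBrowserCache = false
env.allowLocalModels = false

// Load the feature-extraction pipeline once, outside the request handler
const pipe = await pipeline('feature-extraction', 'Supabase/gte-small')

serve(async (req) => {
  const { input } = await req.json()

  // Generate a mean-pooled, normalized embedding for the user's text
  const output = await pipe(input, { pooling: 'mean', normalize: true })
  const embedding = Array.from(output.data)

  // Store the text and its embedding ('documents' and its columns are placeholders)
  const supabase = createClient(
    Deno.env.get('SUPABASE_URL')!,
    Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!
  )
  const { data, error } = await supabase
    .from('documents')
    .insert({ content: input, embedding })
    .select()
    .single()

  if (error) {
    return new Response(JSON.stringify({ error: error.message }), { status: 500 })
  }

  // Return the stored database row as JSON
  return new Response(JSON.stringify(data), {
    headers: { 'Content-Type': 'application/json' },
  })
})
```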
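And a sketch of the webhook-driven background job, assuming a Database Webhook configured to call the function on INSERT; the payload shape (type/table/record) follows Supabase's webhook format, while the table and column names are placeholders:

```ts
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts'
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2'
import { env, pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.5.0'

env.useBrowserCache = false
env.allowLocalModels = false

const pipe = await pipeline('feature-extraction', 'Supabase/gte-small')

serve(async (req) => {
  // Database Webhook payload: { type, table, schema, record, old_record }
  const { record } = await req.json()

  // Embed the plain-text column that was just inserted ('content' is a placeholder)
  const output = await pipe(record.content, { pooling: 'mean', normalize: true })
  const embedding = Array.from(output.data)

  // Write the embedding back onto the same row
  const supabase = createClient(
    Deno.env.get('SUPABASE_URL')!,
    Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!
  )
  const { error } = await supabase
    .from('documents')
    .update({ embedding })
    .eq('id', record.id)

  if (error) {
    return new Response(JSON.stringify({ error: error.message }), { status: 500 })
  }

  return new Response('ok')
})
```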
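For browser-based image search, one possible sketch uses a CLIP text encoder from Transformers.js to embed the natural-language query client-side, then asks Postgres for the nearest image embeddings over RPC; the model choice, the `match_images` function, and its parameters are hypothetical:

```ts
import { AutoTokenizer, CLIPTextModelWithProjection } from '@xenova/transformers'
import { createClient } from '@supabase/supabase-js'

// Placeholder project credentials
const SUPABASE_URL = 'https://<project>.supabase.co'
const SUPABASE_ANON_KEY = '<anon-key>'
const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY)

// Load the CLIP text encoder in the browser (model name is an assumption)
const tokenizer = await AutoTokenizer.from_pretrained('Xenova/clip-vit-base-patch32')
const textModel = await CLIPTextModelWithProjection.from_pretrained('Xenova/clip-vit-base-patch32')

export async function searchImages(query: string) {
  // Embed the natural-language query client-side
  const inputs = tokenizer([query], { padding: true, truncation: true })
  const { text_embeds } = await textModel(inputs)
  const embedding = Array.from(text_embeds.data as Float32Array)

  // 'match_images' is a hypothetical Postgres function that orders rows
  // by pgvector distance to the query embedding
  const { data, error } = await supabase.rpc('match_images', {
    query_embedding: embedding,
    match_count: 10,
  })
  if (error) throw error
  return data
}
```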