Local-First AI with Gemma 4 + Ollama + TanStack AI: A YouTube Knowledge Base
Blog post from Strapi
YT Knowledge Base is a local-first application that turns YouTube videos into structured knowledge resources using AI. The app ingests YouTube URLs, processes video transcripts with a local Ollama model to produce structured summaries, and offers a chat interface that uses BM25 retrieval over the transcripts to answer questions the summaries don't cover. The front end is built with TanStack Start, React 19, and Tailwind v4; the back end uses Strapi 5 with SQLite, and Ollama serves a local Gemma 4 model for all language-model calls.

Summarization takes a two-pronged approach: a single-pass method for short videos and a map-reduce pipeline for longer ones. To preserve accuracy, the app avoids model-generated timecodes and instead relies on BM25 for deterministic grounding. The project demonstrates that advanced AI features can run entirely on local hardware, without hosted APIs, letting users evaluate video content quickly and keep summaries for future reference.
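The language-model calls go through Ollama's local HTTP API. A minimal sketch of such a summarization call is below; the model tag (`"gemma3"`), the prompt wording, and the helper names are assumptions for illustration, not taken from the app's actual code:

```typescript
// Sketch of a summarization call against a local Ollama server
// (POST /api/generate). The model tag and prompt are placeholders;
// the real tag depends on which model has been pulled locally.

type GenerateRequest = {
  model: string;
  prompt: string;
  stream: boolean; // false = return the full response in one JSON body
};

function buildSummaryRequest(
  transcript: string,
  model = "gemma3", // assumed tag; not from the source
): GenerateRequest {
  return {
    model,
    prompt: `Summarize this video transcript into key points:\n\n${transcript}`,
    stream: false,
  };
}

async function summarize(transcript: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildSummaryRequest(transcript)),
  });
  const data = await res.json();
  return data.response; // Ollama puts the generated text in `response`
}
```

Because everything runs against `localhost:11434`, no API keys or hosted endpoints are involved.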
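The single-pass vs. map-reduce split could be routed with a simple length check, as in this sketch. The token thresholds, the characters-per-token heuristic, and the function names are assumptions, not details from the source:

```typescript
// Sketch of routing a transcript to single-pass or map-reduce
// summarization. Thresholds below are illustrative assumptions.

type Plan =
  | { mode: "single-pass"; text: string }
  | { mode: "map-reduce"; chunks: string[] };

// Rough heuristic: ~4 characters per token for English text.
const estimateTokens = (s: string): number => Math.ceil(s.length / 4);

function planSummarization(
  transcript: string,
  maxTokens = 4000,   // assumed budget for one model pass
  chunkTokens = 2000, // assumed size of each map-phase chunk
): Plan {
  if (estimateTokens(transcript) <= maxTokens) {
    return { mode: "single-pass", text: transcript };
  }
  // Split on sentence boundaries and pack sentences into chunks,
  // so each map-phase call stays within the chunk budget.
  const sentences = transcript.split(/(?<=[.!?])\s+/);
  const chunks: string[] = [];
  let current = "";
  for (const s of sentences) {
    if (current && estimateTokens(current + s) > chunkTokens) {
      chunks.push(current.trim());
      current = "";
    }
    current += s + " ";
  }
  if (current.trim()) chunks.push(current.trim());
  return { mode: "map-reduce", chunks };
}
```

In the map-reduce case, each chunk would be summarized independently (the map phase) and the partial summaries combined in a final pass (the reduce phase).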
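BM25 is a deterministic, purely lexical ranking function, which is what makes it suitable for grounding: the same query over the same transcript chunks always retrieves the same passages. A minimal scorer might look like the following; the tokenizer, parameter defaults, and function names are illustrative assumptions:

```typescript
// Minimal BM25 ranking over transcript chunks, a sketch of the kind of
// deterministic retrieval the app uses for grounding. k1 and b are the
// standard BM25 free parameters (common defaults shown).

type Chunk = { id: number; text: string };

const tokenize = (s: string): string[] =>
  s.toLowerCase().match(/[a-z0-9]+/g) ?? [];

function rankChunks(query: string, chunks: Chunk[], k1 = 1.5, b = 0.75) {
  const docs = chunks.map((c) => tokenize(c.text));
  const N = docs.length;
  const avgLen = docs.reduce((sum, d) => sum + d.length, 0) / N;
  const qTerms = tokenize(query);

  // Document frequency for each query term.
  const df = new Map<string, number>();
  for (const t of new Set(qTerms)) {
    df.set(t, docs.filter((d) => d.includes(t)).length);
  }

  const scored = chunks.map((c, i) => {
    const d = docs[i];
    let score = 0;
    for (const t of qTerms) {
      const n = df.get(t) ?? 0;
      if (n === 0) continue;
      const idf = Math.log(1 + (N - n + 0.5) / (n + 0.5));
      const tf = d.filter((w) => w === t).length;
      score +=
        (idf * tf * (k1 + 1)) /
        (tf + k1 * (1 - b + (b * d.length) / avgLen));
    }
    return { id: c.id, score };
  });
  return scored.sort((x, y) => y.score - x.score);
}
```

The top-ranked chunks can then be passed to the local model as context, so answers stay tied to actual transcript passages rather than model-invented timecodes.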