OpenAI Codex with Ollama
Blog post from Ollama
Summary
OpenAI's Codex CLI can be used with open models through Ollama, letting it read, modify, and execute code in your working directory with models such as gpt-oss:20b or gpt-oss:120b. To get started, install the Codex CLI via npm and launch it with the --oss flag, which defaults to the local gpt-oss:20b model. Codex needs a large context window; at least 32K tokens is recommended, and the context length can be adjusted as needed. Switch models with the -m flag; all models on Ollama Cloud are also compatible with Codex. Additional setup instructions and configuration options are in the Codex integration guide.
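The steps above can be sketched as a short shell session. The npm package name and the OLLAMA_CONTEXT_LENGTH environment variable are assumptions not stated in the post; the --oss and -m flags and the gpt-oss model names come from the summary itself.

```shell
# Install the Codex CLI globally via npm
# (assumed package name: @openai/codex — check the Codex docs if it differs)
npm install -g @openai/codex

# Launch Codex against a local Ollama model; --oss defaults to gpt-oss:20b
codex --oss

# Switch models with -m, e.g. the larger gpt-oss:120b
codex --oss -m gpt-oss:120b

# Codex wants a large context window (32K+ tokens recommended).
# One way to raise Ollama's context length (assumed env var) when serving:
OLLAMA_CONTEXT_LENGTH=32768 ollama serve
```

The same flags apply to cloud-hosted models, since the post notes that all models on Ollama Cloud work with Codex.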