What Is OpenCode?
OpenCode is a Go-based CLI application that brings AI assistance directly into your terminal. Built by the open-source community and designed around the philosophy that your development environment should stay as it is, OpenCode works in any shell, with any model, and requires no IDE plugin to function.
Unlike browser-based AI coding tools that lock you into a specific interface, OpenCode meets you where you already work — the terminal. It provides an interactive TUI (Terminal User Interface) built with Bubble Tea, the TUI framework from the Charm ecosystem, giving it a polished, keyboard-driven experience that experienced developers actually want to use.
OpenCode has over 95,000 stars on GitHub and supports more than 75 different LLM providers, making it one of the most provider-agnostic AI coding tools available. If an LLM has an API, OpenCode can probably use it.
Key Features
Provider Agnosticism
OpenCode supports an unusually broad range of AI providers out of the box: OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Azure OpenAI, Groq, and OpenRouter — plus any model accessible via a local endpoint. This means you are not locked into any single provider's pricing or availability. You can switch models mid-session depending on the task.
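A multi-provider setup can be sketched in the config file. The provider names and the env: indirection below follow the pattern from the config example later in this article; treat the exact keys as assumptions and check them against your installed version:

```
{
  "providers": {
    "anthropic": { "apiKey": "env:ANTHROPIC_API_KEY" },
    "gemini": { "apiKey": "env:GEMINI_API_KEY" }
  }
}
```

With several providers configured, switching models mid-session is a matter of picking a different model from the TUI rather than editing config.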
Auto Compact: No More Context Window Panic
One of OpenCode's most practical features is auto compact. As your conversation grows and approaches the model's context window limit, OpenCode automatically summarises the conversation history and creates a new session seeded with the condensed summary — preserving the important context instead of losing it. This is genuinely useful for long debugging sessions or large refactors.
Session Management with SQLite
All conversations and sessions are stored in a local SQLite database, giving you persistent session history that you can search, resume, or audit later. Unlike browser-based tools that lose context when you close a tab, OpenCode sessions survive across restarts and machine reboots.
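To see why SQLite-backed sessions are useful, here is a minimal, self-contained sketch of the idea in Python. The table name and columns are invented for illustration — this is NOT OpenCode's actual schema, just a toy model of persistent, queryable session storage:

```python
import sqlite3

# Illustrative only: a toy sessions table, not OpenCode's real schema.
# OpenCode writes to a database file on disk; we use :memory: to keep
# this sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sessions (id TEXT PRIMARY KEY, title TEXT, created_at TEXT)"
)
conn.execute(
    "INSERT INTO sessions VALUES ('abc123', 'debug auth flow', '2024-01-01')"
)
conn.commit()

# Because sessions live in a database rather than browser memory,
# they can be listed, searched, or resumed after a restart.
rows = conn.execute("SELECT id, title FROM sessions").fetchall()
print(rows)  # [('abc123', 'debug auth flow')]
```

The same property holds for OpenCode's on-disk database: anything a session produced can be queried later with ordinary SQLite tooling.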
Tool Integration
OpenCode can execute shell commands, search and read files, and modify code directly during its reasoning. The bash tool integrates with your default shell (respecting the SHELL environment variable), and you can configure a specific shell path and arguments in the config file if needed.
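If you need something other than your login shell, the config file accepts a shell override. The path and args keys below are an assumption about the exact shape; verify them against the config reference for your version:

```
{
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  }
}
```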
LSP Support
Language Server Protocol support is built in, meaning OpenCode automatically loads the right LSPs for the language you are working in. This gives it accurate code intelligence without requiring a separate editor plugin.
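LSP behaviour can also be tuned per language in the config file. The shape below — entries keyed by language name with a command field — is an assumption based on common patterns in the project's config, not a verified schema:

```
{
  "lsp": {
    "go": {
      "command": "gopls"
    }
  }
}
```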
Provider Configuration
OpenCode reads configuration from ~/.opencode.json, $XDG_CONFIG_HOME/opencode/.opencode.json, or a local .opencode.json in the current directory. You configure AI providers using environment variables — ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY, and so on — which keeps credentials out of config files.
{
  "autoCompact": true,
  "providers": {
    "openai": {
      "apiKey": "env:OPENAI_API_KEY"
    }
  }
}
How to Install
Via the install script:
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | bash
Via Homebrew:
brew install opencode-ai/tap/opencode
Via Go:
go install github.com/opencode-ai/opencode@latest
The Competition
In the AI coding agent space, OpenCode differentiates primarily on cost and openness. Claude Code and GitHub Copilot are tied to their respective parent companies' models and ecosystems. OpenCode's provider-agnostic approach means you can use the most cost-effective model for a given task — Gemini 2.5 Pro for a fast explanation, Opus 4 for a complex architecture decision, a local Ollama model for anything sensitive.
Feature details sourced from the official OpenCode GitHub repository.
Bottom line: OpenCode is one of the most flexible free AI coding agents available. If you want to stay in your terminal, avoid subscription lock-in, and keep full control over which models you use, it is worth spending an evening setting it up.