v0.1.12 released — memory optimizations, OpenAI model discovery. See changelog →

One coding agent. Any LLM.

Full agentic workflow — bash, files, grep, glob, agents, MCP, web search — running on Anthropic, OpenAI, Gemini, Ollama, or any of 200+ models. One CLI, zero vendor lock-in.

$ npm install -g @claudiolabs/claudio@latest
$ bun install -g @claudiolabs/claudio@latest
$ npx @claudiolabs/claudio@latest

Every major LLM. One CLI.

Configure any provider interactively with /provider — no environment variables, no config files, no restarts.

Anthropic (anthropic): API key / OAuth
OpenAI (openai): API key
Gemini (gemini): API key
DeepSeek (deepseek): API key
Mistral (mistral): API key
GitHub Copilot (copilot): GitHub OAuth
Codex / ChatGPT (codex): OAuth
Ollama (ollama): Local · Free
AWS Bedrock (bedrock): AWS credentials
Google Vertex (vertex): ADC
Azure Foundry (foundry): DefaultAzureCredential
OpenRouter (openrouter): API key
Groq (groq): API key
LM Studio (lmstudio): Local · Free
Together AI (together): API key
Any OpenAI-compat (custom): Base URL + key

Built different

No env vars needed

Configure providers and models entirely from inside the app with /provider. Switch live. Run /provider doctor to health-check your active profile.

Web search on any model

DuckDuckGo web search ships built-in for all providers — not just Anthropic. Optional Firecrawl integration for JS-rendered pages. No extra setup required.

Memory system

Persistent memory across sessions — private and team-scoped. Automatic context pruning, token budget management, and stub eviction keep costs low as projects grow.
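To make the idea concrete, here is a minimal sketch of token-budget pruning with stub eviction. Every name and the 4-characters-per-token heuristic are assumptions for illustration, not Claudio's actual implementation:

```python
# Hypothetical sketch of token-budget memory pruning with stub eviction.
# None of these names come from Claudio; they only illustrate the idea.

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def prune_memories(memories: list[dict], budget: int) -> list[dict]:
    """Keep the most recently used memories within a token budget;
    entries that don't fit are evicted down to a short one-line stub."""
    kept, used = [], 0
    for mem in sorted(memories, key=lambda m: m["last_used"], reverse=True):
        cost = estimate_tokens(mem["text"])
        if used + cost <= budget:
            used += cost
            kept.append(mem)
        else:
            # Evict to a stub: keep only the first line, capped at 80 chars.
            stub = {**mem, "text": mem["text"].splitlines()[0][:80], "stub": True}
            kept.append(stub)
    return kept
```

With a small budget, recently used entries survive intact while older ones collapse to cheap stubs, which is how costs can stay flat as a project's memory grows.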

Full agentic toolkit

Bash execution, file tools, grep, glob, sub-agents, tasks, MCP servers, LSP diagnostics, Jupyter notebooks, cron scheduling, plan mode, git worktrees — all in one CLI.

gRPC headless server

Embed the full agentic engine into CI/CD pipelines or custom UIs via gRPC bidirectional streaming. Generate clients in Python, Go, Rust, or any language from the included .proto file.

VS Code extension included

Provider-aware control-center UI, launch integration, and theme support — no separate install. MCP built-in. Skills and plugins system for extensible slash commands.

Zero-friction migration

Coming from Claude Code?

Claudio originated from the Claude Code codebase and has been substantially modified to support any provider. It automatically detects your existing ~/.claude/ and migrates tokens, settings, theme, plugins, and keybindings.

Independent project. Not affiliated with, endorsed by, or sponsored by Anthropic.

Migration guide →
# install claudio
npm install -g @claudiolabs/claudio@latest
 
# run — migration is automatic
claudio
 
Migrated settings from ~/.claude/
Tokens, theme, plugins restored
Ready — switch provider with /provider
Local · No API key · Free

Run fully offline

No API key, no cost, no data sent anywhere. Pull a model with Ollama or LM Studio and start coding in under a minute.

Local setup guide →
# pull a model with ollama
ollama pull qwen2.5-coder:7b
 
# launch claudio with it
claudio --model qwen2.5-coder:7b
 
Provider: ollama (local)
No API key required