A lightweight proxy that translates Anthropic API requests into OpenAI‑compatible format. OpenAI, Gemini, Ollama — one command, zero friction.
## How it works
Five steps, zero configuration changes to Claude Code. The proxy handles everything silently.
1. Claude Code sends requests to the proxy instead of api.anthropic.com
2. Anthropic messages, tools, and system prompts are mapped to the OpenAI schema
3. The translated request is routed to your chosen provider
4. Provider SSE chunks are transformed into Anthropic format on the fly
5. The response is piped back to Claude Code, fully transparent to the client
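Step 2 above can be sketched in Python. The function below is an illustrative assumption based on the two public API schemas (Anthropic's top-level `system` field and typed content blocks vs. OpenAI's message list and nested `function` tool objects), not the proxy's actual code:

```python
# Hypothetical sketch of the Anthropic -> OpenAI request translation.
# Field names follow the two public APIs; the function itself is illustrative.

def anthropic_to_openai(body: dict, model: str) -> dict:
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first message.
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    for msg in body.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten the text blocks.
        if isinstance(content, list):
            content = "".join(
                block["text"] for block in content if block.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})

    out = {
        "model": model,  # e.g. the configured BIG_MODEL for sonnet/opus requests
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
        "stream": body.get("stream", False),
    }
    # Anthropic tools put the JSON schema in input_schema; OpenAI nests
    # it as "parameters" under a "function" object.
    if body.get("tools"):
        out["tools"] = [
            {
                "type": "function",
                "function": {
                    "name": t["name"],
                    "description": t.get("description", ""),
                    "parameters": t["input_schema"],
                },
            }
            for t in body["tools"]
        ]
    return out
```

Steps 4 and 5 are the inverse of this mapping applied chunk by chunk: each provider SSE delta is rewrapped as the corresponding Anthropic streaming event before being piped back.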
## Providers
Switch between providers with a single flag. Each one is purpose-built for its ecosystem.
- **OpenAI**: `gpt-5.2-pro` and `gpt-5.2` with full streaming and tool-use support.
- **Gemini**: `gemini-3.1-pro` and `gemini-3-flash` with automatic `top_p` handling.
- **Ollama**: fully local models such as `llama3.3` and `llama3.2`, with complete privacy. No API key required.

## Quick start
## Configuration
Set via environment variables, CLI flags, or `~/.ccproxy/config.json`.
| Variable | Description | Default |
|---|---|---|
| PREFERRED_PROVIDER | Provider to use: openai, gemini, or ollama | openai |
| OPENAI_API_KEY | OpenAI API key | — |
| GEMINI_API_KEY | Google Gemini API key | — |
| OLLAMA_API_BASE | Ollama server URL | localhost:11434 |
| BIG_MODEL | Model for sonnet / opus requests | provider default |
| SMALL_MODEL | Model for haiku requests | provider default |
| PORT | Server port | 8082 |