Open Source · MIT Licensed

Use Claude Code with any AI provider.

A lightweight proxy that translates Anthropic API requests into OpenAI‑compatible format. OpenAI, Gemini, Ollama — one command, zero friction.

$ curl -fsSL https://ccproxy.himanshuat.com/install.sh | bash
# Start Claude Code with the proxy
$ ccproxy
# Start the proxy server standalone
$ ccproxy server
# View your current configuration
$ ccproxy config

Transparent by design.

Five steps, zero configuration changes to Claude Code. The proxy handles everything silently.

CCProxy Architecture Diagram
01

Intercept

Claude Code sends requests to the proxy instead of api.anthropic.com

02

Translate

Maps Anthropic messages, tools, and system prompts to OpenAI schema

03

Forward

Routes the translated request to your chosen provider

04

Stream

Transforms provider SSE chunks into Anthropic format on the fly

05

Deliver

Pipes the response back to Claude Code — fully transparent
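The translate step (02) can be sketched roughly as follows. This is a simplified illustration, not ccproxy's actual code: the field names follow the public Anthropic Messages and OpenAI Chat Completions APIs, and the real mapping (tool definitions, tool_use blocks, images) is considerably more involved.

```python
def anthropic_to_openai(body: dict, model_map: dict) -> dict:
    """Map an Anthropic /v1/messages body to an OpenAI /v1/chat/completions body."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first chat message.
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    for msg in body.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten the text blocks.
        if isinstance(content, list):
            content = "".join(b["text"] for b in content if b.get("type") == "text")
        messages.append({"role": msg["role"], "content": content})
    return {
        # BIG_MODEL / SMALL_MODEL substitution from the proxy's configuration
        "model": model_map.get(body["model"], body["model"]),
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
        "stream": body.get("stream", False),
    }

request = {
    "model": "claude-sonnet-4-5",
    "system": "You are terse.",
    "messages": [{"role": "user", "content": [{"type": "text", "text": "hi"}]}],
    "max_tokens": 256,
}
translated = anthropic_to_openai(request, {"claude-sonnet-4-5": "gpt-5.2-pro"})
```

The streaming step (04) is the same idea in reverse, applied chunk by chunk as SSE events arrive from the provider.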

Your model, your rules.

Switch between providers with a single flag. Each one is purpose-built for its ecosystem.

OpenAI

gpt-5.2-pro and gpt-5.2 with full streaming and tool use support.

Big model: gpt-5.2-pro
Small model: gpt-5.2

Google Gemini

Gemini 2.5 Pro and Flash with automatic top_p handling.

Big model: gemini-3.1-pro
Small model: gemini-3-flash

Ollama

Run fully local models with complete privacy. No API key required.

Big model: llama3.3
Small model: llama3.2

Up and running in seconds.

# Run the automated installer (macOS/Linux)
$ curl -fsSL https://ccproxy.himanshuat.com/install.sh | bash
# Or install globally with pnpm
$ pnpm install -g claudebox
# Interactively configure models & providers
$ ccproxy config
# Start the proxy server standalone
$ ccproxy server
# Start Claude Code (auto-launches the proxy)
$ ccproxy

Simple, flexible config.

Set via environment variables, CLI flags, or ~/.ccproxy/config.json.

Variable            Description                       Default
PREFERRED_PROVIDER  openai | gemini | ollama          openai
OPENAI_API_KEY      OpenAI API key                    (none)
GEMINI_API_KEY      Google Gemini API key             (none)
OLLAMA_API_BASE     Ollama server URL                 localhost:11434
BIG_MODEL           Model for sonnet / opus requests  provider default
SMALL_MODEL         Model for haiku requests          provider default
PORT                Server port                       8082
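For reference, here is a hypothetical ~/.ccproxy/config.json built from the variables above, set up for a fully local Ollama workflow. The exact key names and casing ccproxy expects in its config file are an assumption; check `ccproxy config` for the authoritative schema.

```json
{
  "PREFERRED_PROVIDER": "ollama",
  "OLLAMA_API_BASE": "localhost:11434",
  "BIG_MODEL": "llama3.3",
  "SMALL_MODEL": "llama3.2",
  "PORT": 8082
}
```

Environment variables and CLI flags would override any values set here.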