Note
Designed by humans, developed by Gemini 3; not yet widely tested.
OpenProxy is a lightweight, zero-config gateway for Claude Code and Gemini CLI, letting both talk to OpenAI-compatible APIs.
- Dual Client Support: One service supports both Claude Code and Gemini CLI.
- Zero Server-Side Config: nothing to configure on the server; all settings are supplied by the client.
- Multi-User Support: Stateless design allows multiple users to use their own API Keys.
- Worker Deployment: Supports deployment to Cloudflare Workers and Vercel.
Docker:
docker run -d -p 3000:3000 ttttmr/openproxy

Local:
npm install
npm run start

Cloudflare Workers:

Vercel:
Point your client to the proxy. The target OpenAI Base URL is embedded directly in the path.
Format: http://<proxy-host>/<target-openai-base-url>
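The path-embedding rule is plain string concatenation: the target base URL, scheme and all, is appended to the proxy host. A minimal sketch, assuming the demo deployment at openproxy.xlab.app used in the example below:

```shell
# Build the client-facing base URL by embedding the target OpenAI base
# URL in the proxy path (format: <proxy-host>/<target-openai-base-url>).
PROXY_HOST="https://openproxy.xlab.app"          # assumed proxy deployment
OPENAI_BASE_URL="https://openrouter.ai/api/v1"   # target backend
ANTHROPIC_BASE_URL="$PROXY_HOST/$OPENAI_BASE_URL"
echo "$ANTHROPIC_BASE_URL"
# → https://openproxy.xlab.app/https://openrouter.ai/api/v1
```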
Example configuration:
# openai
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="sk-..."
export OPENAI_MODEL="gpt-5"
# claude code
export ANTHROPIC_BASE_URL="https://openproxy.xlab.app/$OPENAI_BASE_URL"
export ANTHROPIC_AUTH_TOKEN="sk-..."
export ANTHROPIC_MODEL="gpt-5"
export ANTHROPIC_DEFAULT_OPUS_MODEL=$ANTHROPIC_MODEL
export ANTHROPIC_DEFAULT_SONNET_MODEL=$ANTHROPIC_MODEL
export ANTHROPIC_DEFAULT_HAIKU_MODEL=$ANTHROPIC_MODEL
export CLAUDE_CODE_SUBAGENT_MODEL=$ANTHROPIC_MODEL
# gemini cli
export GOOGLE_GEMINI_BASE_URL="https://openproxy.xlab.app/$OPENAI_BASE_URL"
export GEMINI_API_KEY="sk-..."
export GEMINI_MODEL="gpt-5"
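With the variables above set, the proxy can also be exercised by hand. This is a hypothetical smoke test, not part of the project's documentation: it assumes the demo deployment at openproxy.xlab.app, a key valid for the target backend, and that Claude Code appends /v1/messages to ANTHROPIC_BASE_URL, so the same request is reproduced with curl:

```shell
# Hypothetical smoke test: send an Anthropic-style request through the
# proxy, which translates it for the OpenAI-compatible backend.
# URL = $ANTHROPIC_BASE_URL/v1/messages from the example above.
curl -s "https://openproxy.xlab.app/https://openrouter.ai/api/v1/v1/messages" \
  -H "x-api-key: $OPENAI_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model":"gpt-5","max_tokens":64,"messages":[{"role":"user","content":"Say hi"}]}'
```

A successful response confirms the proxy reached the backend; an auth error usually means the key is not valid for the target, not for the proxy itself.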