# Claude Max API Proxy
`claude-max-api-proxy` is a community tool that exposes your Claude Max/Pro subscription as an OpenAI-compatible API endpoint. This allows you to use your subscription with any tool that supports the OpenAI API format.
## Why Use This?
| Approach | Cost | Best For |
|---|---|---|
| Anthropic API | Pay per token (~$15/M input, $75/M output for Opus) | Production apps, high volume |
| Claude Max subscription | $200/month flat | Personal use, development, unlimited usage |
If you have a Claude Max subscription and want to use it with OpenAI-compatible tools, this proxy can save you significant money.
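As a rough back-of-the-envelope comparison using the Opus API rates above: $200 buys about 13M input tokens ($200 / $15 per million) or roughly 2.7M output tokens ($200 / $75 per million) at pay-per-token pricing, so sustained daily use through a subscription can reach break-even quickly.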
## How It Works
```
Your App         →  claude-max-api-proxy  →  Claude Code CLI    →  Anthropic
(OpenAI format)     (converts format)        (uses your login)     (via subscription)
```

The proxy:

- Accepts OpenAI-format requests at `http://localhost:3456/v1/chat/completions`
- Converts them to Claude Code CLI commands
- Returns responses in OpenAI format (streaming supported)
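Because the endpoint speaks the OpenAI wire format, any OpenAI client library should work against it. Here is a minimal sketch using the official `openai` npm package, assuming the proxy is running on its default port; the API key value is arbitrary because the proxy authenticates through your local Claude Code login:

```ts
import OpenAI from "openai";

// Point the standard OpenAI client at the local proxy instead of api.openai.com.
const client = new OpenAI({
  apiKey: "not-needed", // ignored by the proxy; your Claude subscription is used
  baseURL: "http://localhost:3456/v1",
});

const completion = await client.chat.completions.create({
  model: "claude-opus-4",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);
```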
## Installation
```bash
# Requires Node.js 20+ and Claude Code CLI
npm install -g claude-max-api-proxy

# Verify Claude CLI is authenticated
claude --version
```

## Start the server
```bash
claude-max-api
# Server runs at http://localhost:3456
```

## Test it
```bash
# Health check
curl http://localhost:3456/health

# List models
curl http://localhost:3456/v1/models

# Chat completion
curl http://localhost:3456/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-opus-4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

## With OpenClaw
You can point OpenClaw at the proxy as a custom OpenAI-compatible endpoint:
```json5
{
  env: {
    OPENAI_API_KEY: "not-needed",
    OPENAI_BASE_URL: "http://localhost:3456/v1",
  },
  agents: {
    defaults: {
      model: { primary: "openai/claude-opus-4" },
    },
  },
}
```

## Available Models
| Model ID | Maps To |
|---|---|
| `claude-opus-4` | Claude Opus 4 |
| `claude-sonnet-4` | Claude Sonnet 4 |
| `claude-haiku-4` | Claude Haiku 4 |
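To confirm which model IDs your local proxy actually exposes, you can query the `/v1/models` endpoint shown earlier. This sketch assumes the response follows the standard OpenAI list shape (`{ data: [{ id, ... }] }`):

```ts
// List the model IDs the proxy advertises (Node 20+ provides a global fetch).
const res = await fetch("http://localhost:3456/v1/models");
const { data } = (await res.json()) as { data: { id: string }[] };

for (const model of data) {
  console.log(model.id); // e.g. claude-opus-4, claude-sonnet-4, claude-haiku-4
}
```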
## Auto-Start on macOS
Create a LaunchAgent to run the proxy automatically:
```bash
cat > ~/Library/LaunchAgents/com.claude-max-api.plist << 'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.claude-max-api</string>
  <key>RunAtLoad</key>
  <true/>
  <key>KeepAlive</key>
  <true/>
  <key>ProgramArguments</key>
  <array>
    <string>/usr/local/bin/node</string>
    <string>/usr/local/lib/node_modules/claude-max-api-proxy/dist/server/standalone.js</string>
  </array>
  <key>EnvironmentVariables</key>
  <dict>
    <key>PATH</key>
    <string>/usr/local/bin:/opt/homebrew/bin:~/.local/bin:/usr/bin:/bin</string>
  </dict>
</dict>
</plist>
EOF
```
```bash
launchctl bootstrap gui/$(id -u) ~/Library/LaunchAgents/com.claude-max-api.plist
```

## Links

- npm: https://www.npmjs.com/package/claude-max-api-proxy
- GitHub: https://github.com/atalovesyou/claude-max-api-proxy
- Issues: https://github.com/atalovesyou/claude-max-api-proxy/issues
## Notes

- This is a community tool, not officially supported by Anthropic or OpenClaw
- Requires an active Claude Max/Pro subscription with Claude Code CLI authenticated
- The proxy runs locally and does not send data to any third-party servers
- Streaming responses are fully supported (see the example below)
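As a sketch of how streaming looks from a client, here is the same `openai` package with `stream: true`; this assumes the proxy emits standard OpenAI-style chunks, with deltas under `choices[0].delta.content`:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "not-needed",
  baseURL: "http://localhost:3456/v1",
});

// Stream tokens as they arrive rather than waiting for the full response.
const stream = await client.chat.completions.create({
  model: "claude-sonnet-4",
  messages: [{ role: "user", content: "Write a haiku about local proxies." }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```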
## See Also
- Anthropic provider - Native OpenClaw integration with Claude setup-token or API keys
- OpenAI provider - For OpenAI/Codex subscriptions