githubnext/claude-code-proxy

Run Claude Code on OpenAI models

 
 


Anthropic API Proxy

Use Anthropic clients (like Claude Code) with Gemini or OpenAI backends. 🤝

A proxy server that lets you use Anthropic clients with Gemini or OpenAI models via LiteLLM. 🌉

Quick Start ⚡

Prerequisites

  • OpenAI API key 🔑
  • Google AI Studio (Gemini) API key (if using Google provider) 🔑
  • uv installed.

Setup 🛠️

  1. Clone this repository:

    git clone https://github.com/1rgs/claude-code-openai.git
    cd claude-code-openai
  2. Install uv (if you haven't already):

    curl -LsSf https://astral.sh/uv/install.sh | sh

    (uv will handle dependencies based on pyproject.toml when you run the server)

  3. Configure Environment Variables: Copy the example environment file:

    cp .env.example .env

    Edit .env and fill in your API keys and model configurations:

    • ANTHROPIC_API_KEY: (Optional) Needed only if proxying to Anthropic models.
    • OPENAI_API_KEY: Your OpenAI API key (required if using the default OpenAI preference or as a fallback).
    • GEMINI_API_KEY: Your Google AI Studio (Gemini) API key (required if PREFERRED_PROVIDER=google).
    • PREFERRED_PROVIDER (Optional): Set to openai (default) or google. This determines the primary backend for mapping haiku/sonnet.
    • BIG_MODEL (Optional): The model to map sonnet requests to. Defaults to gpt-4.1 (if PREFERRED_PROVIDER=openai) or gemini-2.5-pro-preview-03-25 (if PREFERRED_PROVIDER=google).
    • SMALL_MODEL (Optional): The model to map haiku requests to. Defaults to gpt-4.1-mini (if PREFERRED_PROVIDER=openai) or gemini-2.0-flash (if PREFERRED_PROVIDER=google).

    Mapping Logic:

    • If PREFERRED_PROVIDER=openai (default), haiku/sonnet map to SMALL_MODEL/BIG_MODEL prefixed with openai/.
    • If PREFERRED_PROVIDER=google, haiku/sonnet map to SMALL_MODEL/BIG_MODEL prefixed with gemini/ if those models are in the server's known GEMINI_MODELS list (otherwise the mapping falls back to OpenAI).
  4. Run the server:

    uv run uvicorn server:app --host 0.0.0.0 --port 8082 --reload

    (--reload is optional, for development)
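The mapping logic described in step 3 can be sketched as follows. This is an illustrative reconstruction, not the server's actual code: map_model() and its parameter names are hypothetical, and the exact fallback behavior when a Google-preferred target is not a known Gemini model is an assumption based on the description above.

```python
# Hypothetical sketch of the haiku/sonnet mapping rules; map_model() is an
# illustrative name, and the fallback path is assumed, not taken from the code.
GEMINI_MODELS = {"gemini-2.5-pro-preview-03-25", "gemini-2.0-flash"}

def map_model(requested: str, preferred: str, big: str, small: str) -> str:
    """Map a Claude alias (haiku/sonnet) to a prefixed backend model."""
    if "sonnet" in requested:
        target = big
    elif "haiku" in requested:
        target = small
    else:
        return requested  # anything else passes through unchanged
    if preferred == "google" and target in GEMINI_MODELS:
        return f"gemini/{target}"
    return f"openai/{target}"  # default (and assumed fallback): OpenAI mapping

print(map_model("claude-3-5-sonnet-20241022", "openai", "gpt-4.1", "gpt-4.1-mini"))
# → openai/gpt-4.1
```

With PREFERRED_PROVIDER=google and SMALL_MODEL=gemini-2.0-flash, a haiku request would map to gemini/gemini-2.0-flash under the same rules.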

Using with Claude Code 🎮

  1. Install Claude Code (if you haven't already):

    npm install -g @anthropic-ai/claude-code
  2. Connect to your proxy:

    ANTHROPIC_BASE_URL=http://localhost:8082 claude
  3. That's it! Your Claude Code client will now use the configured backend models (OpenAI by default) through the proxy. 🎯

Model Mapping 🗺️

The proxy automatically maps Claude models to either OpenAI or Gemini models based on the configured provider and model settings:

Claude Model | Default Mapping     | When BIG_MODEL/SMALL_MODEL is a Gemini model
haiku        | openai/gpt-4.1-mini | gemini/[model-name]
sonnet       | openai/gpt-4.1      | gemini/[model-name]

Supported Models

OpenAI Models

The following OpenAI models are supported with automatic openai/ prefix handling:

  • o3-mini
  • o1
  • o1-mini
  • o1-pro
  • gpt-4.5-preview
  • gpt-4o
  • gpt-4o-audio-preview
  • chatgpt-4o-latest
  • gpt-4o-mini
  • gpt-4o-mini-audio-preview
  • gpt-4.1
  • gpt-4.1-mini

Gemini Models

The following Gemini models are supported with automatic gemini/ prefix handling:

  • gemini-2.5-pro-preview-03-25
  • gemini-2.0-flash

Model Prefix Handling

The proxy automatically adds the appropriate prefix to model names:

  • OpenAI models get the openai/ prefix
  • Gemini models get the gemini/ prefix
  • The BIG_MODEL and SMALL_MODEL will get the appropriate prefix based on whether they're in the OpenAI or Gemini model lists

For example:

  • gpt-4o becomes openai/gpt-4o
  • gemini-2.5-pro-preview-03-25 becomes gemini/gemini-2.5-pro-preview-03-25
  • When BIG_MODEL is set to a Gemini model, Claude Sonnet will map to gemini/[model-name]
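The prefixing rule can be sketched like this. The model sets mirror the Supported Models lists above (abbreviated), and with_prefix() is an illustrative name rather than the proxy's real function:

```python
# Hypothetical helper for the prefix rule; the sets abbreviate the
# Supported Models lists, and with_prefix() is an illustrative name.
OPENAI_MODELS = {"gpt-4o", "gpt-4o-mini", "gpt-4.1", "gpt-4.1-mini", "o1", "o1-mini", "o3-mini"}
GEMINI_MODELS = {"gemini-2.5-pro-preview-03-25", "gemini-2.0-flash"}

def with_prefix(model: str) -> str:
    if model in OPENAI_MODELS:
        return f"openai/{model}"
    if model in GEMINI_MODELS:
        return f"gemini/{model}"
    return model  # already prefixed or unknown: pass through unchanged

print(with_prefix("gpt-4o"))            # → openai/gpt-4o
print(with_prefix("gemini-2.0-flash"))  # → gemini/gemini-2.0-flash
```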

Customizing Model Mapping

Control the mapping using environment variables in your .env file or directly:

Example 1: Default (Use OpenAI)

No changes needed in .env beyond API keys, or ensure:

OPENAI_API_KEY="your-openai-key"
GEMINI_API_KEY="your-google-key" # Needed if PREFERRED_PROVIDER=google
# PREFERRED_PROVIDER="openai" # Optional, it's the default
# BIG_MODEL="gpt-4.1" # Optional, it's the default
# SMALL_MODEL="gpt-4.1-mini" # Optional, it's the default

Example 2: Prefer Google

GEMINI_API_KEY="your-google-key"
OPENAI_API_KEY="your-openai-key" # Needed for fallback
PREFERRED_PROVIDER="google"
# BIG_MODEL="gemini-2.5-pro-preview-03-25" # Optional, it's the default for Google pref
# SMALL_MODEL="gemini-2.0-flash" # Optional, it's the default for Google pref

Example 3: Use Specific OpenAI Models

OPENAI_API_KEY="your-openai-key"
GEMINI_API_KEY="your-google-key"
PREFERRED_PROVIDER="openai"
BIG_MODEL="gpt-4o" # Example specific model
SMALL_MODEL="gpt-4o-mini" # Example specific model

How It Works 🧩

This proxy works by:

  1. Receiving requests in Anthropic's API format 📥
  2. Translating the requests to OpenAI-compatible format via LiteLLM 🔄
  3. Sending the translated request to the selected backend (OpenAI or Gemini) 📤
  4. Converting the response back to Anthropic format 🔄
  5. Returning the formatted response to the client ✅

The proxy handles both streaming and non-streaming responses, maintaining compatibility with all Claude clients. 🌊
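As a rough illustration of step 2, here is a simplified translation of an Anthropic-style request body into an OpenAI chat-completions body. The field names follow the two public API shapes, but this is a minimal sketch, not the proxy's actual code; the real translation (including tool use and streaming) is delegated to LiteLLM.

```python
# Simplified sketch of the request translation; real handling of tools,
# content blocks, and streaming is done by LiteLLM, not this function.
def anthropic_to_openai(body: dict) -> dict:
    messages = []
    if body.get("system"):  # Anthropic carries the system prompt out-of-band
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body["messages"])  # user/assistant turns carry over
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
        "stream": body.get("stream", False),
    }

request = {
    "model": "openai/gpt-4.1",
    "system": "Be brief.",
    "messages": [{"role": "user", "content": "Hi"}],
    "max_tokens": 100,
}
print(anthropic_to_openai(request)["messages"][0])
# → {'role': 'system', 'content': 'Be brief.'}
```

The inverse step (4) reshapes the completion back into Anthropic's response format before returning it to the client.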

Contributing 🤝

Contributions are welcome! Please feel free to submit a Pull Request. 🎁
