Using any model via LiteLLM

Note

The LiteLLM integration is in beta. You may run into issues with some model providers, especially smaller ones. Please report any issues via GitHub issues and we'll fix them quickly.

LiteLLM is a library that allows you to use 100+ models via a single interface. We've added a LiteLLM integration to allow you to use any AI model in the Agents SDK.

Setup

You'll need to ensure litellm is available. You can do this by installing the optional litellm dependency group:

pipinstall"openai-agents[litellm]"

Once done, you can use LitellmModel in any agent.
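For instance, here is a minimal sketch of passing a LitellmModel instance as an agent's model. The model string and API key below are placeholders; use any LiteLLM-supported provider/model string and the corresponding key.

from agents import Agent
from agents.extensions.models.litellm_model import LitellmModel

# Minimal sketch: the model string and api_key are placeholders, not recommendations.
agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model=LitellmModel(
        model="anthropic/claude-3-5-sonnet-20240620",  # any LiteLLM provider/model string
        api_key="your-api-key",
    ),
)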

Example

This is a fully working example. When you run it, you'll be prompted for a model name and API key. For example, you could enter:

  • openai/gpt-4.1 for the model, and your OpenAI API key
  • anthropic/claude-3-5-sonnet-20240620 for the model, and your Anthropic API key
  • etc.

For a full list of models supported in LiteLLM, see the litellm providers docs.

from __future__ import annotations

import asyncio

from agents import Agent, Runner, function_tool, set_tracing_disabled
from agents.extensions.models.litellm_model import LitellmModel


@function_tool
def get_weather(city: str):
    print(f"[debug] getting weather for {city}")
    return f"The weather in {city} is sunny."


async def main(model: str, api_key: str):
    agent = Agent(
        name="Assistant",
        instructions="You only respond in haikus.",
        model=LitellmModel(model=model, api_key=api_key),
        tools=[get_weather],
    )

    result = await Runner.run(agent, "What's the weather in Tokyo?")
    print(result.final_output)


if __name__ == "__main__":
    # First try to get model/api key from args
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--model", type=str, required=False)
    parser.add_argument("--api-key", type=str, required=False)
    args = parser.parse_args()

    model = args.model
    if not model:
        model = input("Enter a model name for Litellm: ")

    api_key = args.api_key
    if not api_key:
        api_key = input("Enter an API key for Litellm: ")

    asyncio.run(main(model, api_key))
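You can pass the model and API key as flags, or omit them and enter them when prompted. For example, if you save the script as litellm_example.py (the filename is just a placeholder):

python litellm_example.py --model anthropic/claude-3-5-sonnet-20240620 --api-key your-api-key

If either flag is omitted, the script falls back to prompting you on stdin, as shown in the __main__ block above.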
