Using any model via LiteLLM
Note
The LiteLLM integration is in beta. You may run into issues with some model providers, especially smaller ones. Please report any issues via GitHub issues and we'll fix them quickly.
LiteLLM is a library that allows you to use 100+ models via a single interface. We've added a LiteLLM integration to allow you to use any AI model in the Agents SDK.
Setup
You'll need to ensure `litellm` is available. You can do this by installing the optional `litellm` dependency group:
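For example, assuming the SDK is distributed as the `openai-agents` package with a `litellm` extra (a sketch; adjust the package name if your install differs):

```shell
pip install "openai-agents[litellm]"
```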
Once done, you can use `LitellmModel` in any agent.
Example
This is a fully working example. When you run it, you'll be prompted for a model name and API key. For example, you could enter:
- `openai/gpt-4.1` for the model, and your OpenAI API key
- `anthropic/claude-3-5-sonnet-20240620` for the model, and your Anthropic API key
- etc.
For a full list of models supported in LiteLLM, see the litellm providers docs.
```python
from __future__ import annotations

import asyncio

from agents import Agent, Runner, function_tool, set_tracing_disabled
from agents.extensions.models.litellm_model import LitellmModel


@function_tool
def get_weather(city: str):
    print(f"[debug] getting weather for {city}")
    return f"The weather in {city} is sunny."


async def main(model: str, api_key: str):
    agent = Agent(
        name="Assistant",
        instructions="You only respond in haikus.",
        model=LitellmModel(model=model, api_key=api_key),
        tools=[get_weather],
    )

    result = await Runner.run(agent, "What's the weather in Tokyo?")
    print(result.final_output)


if __name__ == "__main__":
    # First try to get model/api key from args
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--model", type=str, required=False)
    parser.add_argument("--api-key", type=str, required=False)
    args = parser.parse_args()

    model = args.model
    if not model:
        model = input("Enter a model name for Litellm: ")

    api_key = args.api_key
    if not api_key:
        api_key = input("Enter an API key for Litellm: ")

    asyncio.run(main(model, api_key))
```

Tracking usage data
If you want LiteLLM responses to populate the Agents SDK usage metrics, pass `ModelSettings(include_usage=True)` when creating your agent.
```python
from agents import Agent, ModelSettings
from agents.extensions.models.litellm_model import LitellmModel

agent = Agent(
    name="Assistant",
    model=LitellmModel(model="your/model", api_key="..."),
    model_settings=ModelSettings(include_usage=True),
)
```

With `include_usage=True`, LiteLLM requests report token and request counts through `result.context_wrapper.usage`, just like the built-in OpenAI models.