Model interface
ModelTracing
Bases: Enum
Source code in src/agents/models/interface.py
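This page only lists the base class, but in practice the tracing mode is what a Model implementation consults to decide whether to record a span for a call and whether that span may include request/response data. Below is a minimal sketch of branching on it; the member names (`DISABLED`, `ENABLED`, `ENABLED_WITHOUT_DATA`) are assumptions based on typical versions of the SDK source, not taken from this page.

```python
from agents.models.interface import ModelTracing


def tracing_policy(tracing: ModelTracing) -> str:
    """Describe what a Model implementation should record for this call.

    The member names used here are assumed; verify them against
    src/agents/models/interface.py in your installed SDK version.
    """
    if tracing == ModelTracing.DISABLED:
        return "do not record a generation span"
    if tracing == ModelTracing.ENABLED_WITHOUT_DATA:
        return "record a span, but omit inputs and outputs"
    return "record a span including request and response data"
```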
Model
Bases: ABC
The base interface for calling an LLM.
Source code in src/agents/models/interface.py
get_response abstractmethod async
get_response(
    system_instructions: str | None,
    input: str | list[TResponseInputItem],
    model_settings: ModelSettings,
    tools: list[Tool],
    output_schema: AgentOutputSchemaBase | None,
    handoffs: list[Handoff],
    tracing: ModelTracing,
    *,
    previous_response_id: str | None,
    prompt: ResponsePromptParam | None,
) -> ModelResponse
Get a response from the model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
system_instructions | str | None | The system instructions to use. | required |
input | str | list[TResponseInputItem] | The input items to the model, in OpenAI Responses format. | required |
model_settings | ModelSettings | The model settings to use. | required |
tools | list[Tool] | The tools available to the model. | required |
output_schema | AgentOutputSchemaBase | None | The output schema to use. | required |
handoffs | list[Handoff] | The handoffs available to the model. | required |
tracing | ModelTracing | Tracing configuration. | required |
previous_response_id | str | None | The ID of the previous response. Generally not used by the model, except for the OpenAI Responses API. | required |
prompt | ResponsePromptParam | None | The prompt config to use for the model. | required |
Returns:
Type | Description |
---|---|
ModelResponse | The full model response. |
Source code in src/agents/models/interface.py
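As a usage sketch (not taken from this page), this is roughly how a caller drives get_response directly, with empty tool/handoff lists and default settings. The `ModelTracing.DISABLED` member name and the shape of the returned `ModelResponse.output` are assumptions to verify against your SDK version.

```python
from agents import ModelSettings
from agents.models.interface import Model, ModelTracing


async def ask(model: Model, question: str):
    """Hypothetical one-shot call through the Model interface."""
    response = await model.get_response(
        system_instructions="You are a concise assistant.",
        input=question,                  # a plain string is accepted, per the table above
        model_settings=ModelSettings(),  # default model settings
        tools=[],                        # no tools exposed to the model
        output_schema=None,              # free-form text output
        handoffs=[],                     # no handoffs available
        tracing=ModelTracing.DISABLED,   # assumed enum member; see ModelTracing above
        previous_response_id=None,
        prompt=None,
    )
    # ModelResponse.output holds output items in OpenAI Responses format;
    # extracting plain text from those items is left out of this sketch.
    return response.output
```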
stream_response abstractmethod
stream_response(
    system_instructions: str | None,
    input: str | list[TResponseInputItem],
    model_settings: ModelSettings,
    tools: list[Tool],
    output_schema: AgentOutputSchemaBase | None,
    handoffs: list[Handoff],
    tracing: ModelTracing,
    *,
    previous_response_id: str | None,
    prompt: ResponsePromptParam | None,
) -> AsyncIterator[TResponseStreamEvent]
Stream a response from the model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
system_instructions | str | None | The system instructions to use. | required |
input | str | list[TResponseInputItem] | The input items to the model, in OpenAI Responses format. | required |
model_settings | ModelSettings | The model settings to use. | required |
tools | list[Tool] | The tools available to the model. | required |
output_schema | AgentOutputSchemaBase | None | The output schema to use. | required |
handoffs | list[Handoff] | The handoffs available to the model. | required |
tracing | ModelTracing | Tracing configuration. | required |
previous_response_id | str | None | The ID of the previous response. Generally not used by the model, except for the OpenAI Responses API. | required |
prompt | ResponsePromptParam | None | The prompt config to use for the model. | required |
Returns:
Type | Description |
---|---|
AsyncIterator[TResponseStreamEvent] | An iterator of response stream events, in OpenAI Responses format. |
Source code in src/agents/models/interface.py
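A corresponding sketch for the streaming path accumulates text deltas from the event stream. The event type string and its `delta` attribute are assumptions about the OpenAI Responses stream format rather than something documented on this page.

```python
from agents import ModelSettings
from agents.models.interface import Model, ModelTracing


async def stream_text(model: Model, question: str) -> str:
    """Hypothetical helper: collect streamed text deltas into one string."""
    chunks: list[str] = []
    async for event in model.stream_response(
        system_instructions=None,
        input=question,
        model_settings=ModelSettings(),
        tools=[],
        output_schema=None,
        handoffs=[],
        tracing=ModelTracing.DISABLED,  # assumed enum member; see ModelTracing above
        previous_response_id=None,
        prompt=None,
    ):
        # Events arrive in OpenAI Responses stream format; the type string
        # below is an assumption about that format.
        if getattr(event, "type", None) == "response.output_text.delta":
            chunks.append(event.delta)
    return "".join(chunks)
```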
ModelProvider
Bases: ABC
The base interface for a model provider.
A model provider is responsible for looking up Models by name.
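A minimal sketch of a custom provider follows, assuming the lookup hook is a get_model(model_name) method returning a Model; that method name is not listed on this page, so treat it as an assumption to verify against the source.

```python
from __future__ import annotations

from agents.models.interface import Model, ModelProvider


class SingleModelProvider(ModelProvider):
    """Hypothetical provider that returns the same Model for every name."""

    def __init__(self, model: Model) -> None:
        self._model = model

    def get_model(self, model_name: str | None) -> Model:
        # Assumed signature for the ABC's lookup method; a real provider
        # would map model_name to different backends here.
        return self._model
```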