# openai-agents-mcp

An MCP extension package for the OpenAI Agents SDK.
This package extends the OpenAI Agents SDK to add support for Model Context Protocol (MCP) servers. With this extension, you can seamlessly use MCP servers and their tools with the OpenAI Agents SDK.
The project is built using the `mcp-agent` library.
- Connect OpenAI Agents to MCP servers
- Access tools from MCP servers alongside native OpenAI Agent SDK tools
- Configure MCP servers via standard configuration files
- Automatic tool discovery and conversion from MCP to Agent SDK format
Install with uv:

```bash
uv add openai-agents-mcp
```

Or with pip:

```bash
pip install openai-agents-mcp
```
> [!TIP]
> The `examples` directory has several example applications to get started with. To run an example, clone this repo, then:

```bash
cd examples
cp mcp_agent.secrets.yaml.example mcp_agent.secrets.yaml  # Update API keys if needed
uv run hello_world_mcp.py  # Or any other example
```
To use the Agents SDK with MCP, simply replace the following import:

```diff
- from agents import Agent
+ from agents_mcp import Agent
```
With that change, you can instantiate an Agent with `mcp_servers` in addition to `tools` (which continue to work as before).
```python
from agents_mcp import Agent

# Create an agent with specific MCP servers you want to use
# These must be defined in your mcp_agent.config.yaml file
agent = Agent(
    name="MCP Agent",
    instructions="""You are a helpful assistant with access to both local/OpenAI tools
    and tools from MCP servers. Use these tools to help the user.""",
    # Local/OpenAI tools
    tools=[get_current_weather],
    # Specify which MCP servers to use
    # These must be defined in your mcp_agent config
    mcp_servers=["fetch", "filesystem"],
)
```
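The snippets in this README reference a local tool named `get_current_weather` that is not defined here. A minimal, hypothetical stand-in (with placeholder logic; in a real project you would register it as an Agents SDK function tool) could look like:

```python
def get_current_weather(city: str) -> str:
    """Return a canned weather report for the given city (placeholder logic)."""
    return f"The weather in {city} is sunny and 75°F."
```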
Then define an `mcp_agent.config.yaml` with the MCP server configuration:
```yaml
mcp:
  servers:
    fetch:
      command: npx
      args: ["-y", "@modelcontextprotocol/server-fetch"]
    filesystem:
      command: npx
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
```
That's it! The rest of the Agents SDK works exactly as before.
Head over to the `examples` directory to see MCP servers in action with the Agents SDK.
*Demo video: `openai_agents_mcp.mov`*
More details and nuances below.
You can specify the names of MCP servers to give an Agent access to by setting its `mcp_servers` property.
The Agent will then automatically aggregate tools from those servers, as well as any `tools` specified, into a single extended list of tools. This means you can seamlessly use local tools, MCP servers, and other kinds of Agent SDK tools through a single unified syntax.
```python
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to MCP tools.",
    tools=[your_other_tools],  # Regular tools for the Agent SDK
    mcp_servers=["fetch", "filesystem"],  # Names of MCP servers from your config file (see below)
)
```
Configure MCP servers by creating an `mcp_agent.config.yaml` file. You can place this file in your project directory or any parent directory.
Here's an example configuration file that defines three MCP servers:
```yaml
$schema: "https://raw.githubusercontent.com/lastmile-ai/mcp-agent/main/schema/mcp-agent.config.schema.json"
mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
    slack:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-slack"]
```
For servers that require sensitive information like API keys, you can:

- Define them directly in the config file (not recommended for production)
- Use a separate `mcp_agent.secrets.yaml` file (more secure)
- Set them as environment variables
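For the second option, a hypothetical `mcp_agent.secrets.yaml` sketch is below; the exact environment variable names depend on the server you use, and the token value is a placeholder:

```yaml
# mcp_agent.secrets.yaml — keep this file out of version control (e.g. add it to .gitignore)
mcp:
  servers:
    slack:
      env:
        SLACK_BOT_TOKEN: "xoxb-your-token-here"  # placeholder value
```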
This extension supports several ways to configure MCP servers:
The simplest approach lets the SDK automatically find your configuration files if they are named `mcp_agent.config.yaml` and `mcp_agent.secrets.yaml`:
```python
from agents import Runner  # Runner comes from the base Agents SDK
from agents_mcp import Agent, RunnerContext

# Create an agent that references MCP servers
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to MCP tools.",
    mcp_servers=["fetch", "filesystem"],  # Names of servers from your config file
)

result = await Runner.run(agent, input="Hello world", context=RunnerContext())
```
You can explicitly specify the path to your config file:

```python
from agents_mcp import RunnerContext

context = RunnerContext(mcp_config_path="/path/to/mcp_agent.config.yaml")
```
You can programmatically define your MCP settings:

```python
from mcp_agent.config import MCPSettings, MCPServerSettings
from agents_mcp import RunnerContext

# Define MCP config programmatically
mcp_config = MCPSettings(
    servers={
        "fetch": MCPServerSettings(
            command="uvx",
            args=["mcp-server-fetch"],
        ),
        "filesystem": MCPServerSettings(
            command="npx",
            args=["-y", "@modelcontextprotocol/server-filesystem", "."],
        ),
    },
)

context = RunnerContext(mcp_config=mcp_config)
```
You can create and configure your own MCP server registry:

```python
from mcp_agent.mcp_server_registry import ServerRegistry
from mcp_agent.config import get_settings
from agents_mcp import Agent

# Create a custom server registry
settings = get_settings("/path/to/config.yaml")
server_registry = ServerRegistry(config=settings)

# Create an agent with this registry
agent = Agent(
    name="Custom Registry Agent",
    instructions="You have access to custom MCP servers.",
    mcp_servers=["fetch", "filesystem"],
    mcp_server_registry=server_registry,  # Use custom registry
)
```
A simple example demonstrating how to create an agent that uses MCP tools:

```python
from agents import Runner  # Runner comes from the base Agents SDK
from agents_mcp import Agent, RunnerContext

# Create an agent with MCP servers
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to tools.",
    tools=[get_current_weather],  # Local tools
    mcp_servers=["fetch", "filesystem"],  # MCP servers
)

# Run the agent
result = await Runner.run(
    agent,
    input="What's the weather in Miami? Also, can you fetch the OpenAI website?",
    context=RunnerContext(),
)
print(result.response.value)
```
See `hello_world_mcp.py` for the complete example.
To stream responses instead of waiting for the complete result:

```python
from openai.types.responses import ResponseTextDeltaEvent  # for filtering text-delta events

result = Runner.run_streamed(  # Note: no await here
    agent,
    input="Print the first paragraph of https://openai.github.io/openai-agents-python/",
    context=context,
)

# Stream the events
async for event in result.stream_events():
    if event.type == "raw_response_event" and isinstance(event.data, ResponseTextDeltaEvent):
        print(event.data.delta, end="", flush=True)
```
See `hello_world_mcp_streamed.py` for the complete example.
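Note that the snippets above use top-level `await`, which works in a notebook or an async context; in a standalone script, wrap the calls in an async entry point. A minimal sketch of that pattern (the body is a placeholder for your agent-running code):

```python
import asyncio


async def main() -> str:
    # Build your agent and `await Runner.run(...)` here; this placeholder
    # just returns a value so the pattern is runnable on its own.
    return "done"


if __name__ == "__main__":
    print(asyncio.run(main()))
```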
This project is made possible thanks to projects like `mcp-agent` and the OpenAI Agents SDK.

Licensed under the MIT License.