
The official Python SDK for Model Context Protocol servers and clients

Python implementation of the Model Context Protocol (MCP)


Overview

The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:

  • Build MCP clients that can connect to any MCP server
  • Create MCP servers that expose resources, prompts and tools
  • Use standard transports like stdio and SSE
  • Handle all MCP protocol messages and lifecycle events

Installation

Adding MCP to your Python project

We recommend using uv to manage your Python projects.

If you haven't created a uv-managed project yet, create one:

```bash
uv init mcp-server-demo
cd mcp-server-demo
```

Then add MCP to your project dependencies:

```bash
uv add "mcp[cli]"
```

Alternatively, for projects using pip for dependencies:

```bash
pip install "mcp[cli]"
```

Running the standalone MCP development tools

To run the mcp command with uv:

```bash
uv run mcp
```

Quickstart

Let's create a simple MCP server that exposes a calculator tool and some data:

```python
# server.py
from mcp.server.fastmcp import FastMCP

# Create an MCP server
mcp = FastMCP("Demo")


# Add an addition tool
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b


# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"
```

You can install this server in Claude Desktop and interact with it right away by running:

```bash
mcp install server.py
```

Alternatively, you can test it with the MCP Inspector:

```bash
mcp dev server.py
```

What is MCP?

The Model Context Protocol (MCP) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but specifically designed for LLM interactions. MCP servers can:

  • Expose data through Resources (think of these sort of like GET endpoints; they are used to load information into the LLM's context)
  • Provide functionality through Tools (sort of like POST endpoints; they are used to execute code or otherwise produce a side effect)
  • Define interaction patterns through Prompts (reusable templates for LLM interactions)
  • And more!

Core Concepts

Server

The FastMCP server is your core interface to the MCP protocol. It handles connection management, protocol compliance, and message routing:

```python
# Add lifespan support for startup/shutdown with strong typing
from contextlib import asynccontextmanager
from collections.abc import AsyncIterator
from dataclasses import dataclass

from fake_database import Database  # Replace with your actual DB type

from mcp.server.fastmcp import Context, FastMCP

# Create a named server
mcp = FastMCP("My App")

# Specify dependencies for deployment and development
mcp = FastMCP("My App", dependencies=["pandas", "numpy"])


@dataclass
class AppContext:
    db: Database


@asynccontextmanager
async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
    """Manage application lifecycle with type-safe context"""
    # Initialize on startup
    db = await Database.connect()
    try:
        yield AppContext(db=db)
    finally:
        # Cleanup on shutdown
        await db.disconnect()


# Pass lifespan to server
mcp = FastMCP("My App", lifespan=app_lifespan)


# Access type-safe lifespan context in tools
@mcp.tool()
def query_db(ctx: Context) -> str:
    """Tool that uses initialized resources"""
    db = ctx.request_context.lifespan_context.db
    return db.query()
```

Resources

Resources are how you expose data to LLMs. They're similar to GET endpoints in a REST API - they provide data but shouldn't perform significant computation or have side effects:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")


@mcp.resource("config://app")
def get_config() -> str:
    """Static configuration data"""
    return "App configuration here"


@mcp.resource("users://{user_id}/profile")
def get_user_profile(user_id: str) -> str:
    """Dynamic user data"""
    return f"Profile data for user {user_id}"
```

Tools

Tools let LLMs take actions through your server. Unlike resources, tools are expected to perform computation and have side effects:

```python
import httpx

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")


@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Calculate BMI given weight in kg and height in meters"""
    return weight_kg / (height_m**2)


@mcp.tool()
async def fetch_weather(city: str) -> str:
    """Fetch current weather for a city"""
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.weather.com/{city}")
        return response.text
```

Prompts

Prompts are reusable templates that help LLMs interact with your server effectively:

```python
from mcp.server.fastmcp import FastMCP
from mcp.server.fastmcp.prompts import base

mcp = FastMCP("My App")


@mcp.prompt()
def review_code(code: str) -> str:
    return f"Please review this code:\n\n{code}"


@mcp.prompt()
def debug_error(error: str) -> list[base.Message]:
    return [
        base.UserMessage("I'm seeing this error:"),
        base.UserMessage(error),
        base.AssistantMessage("I'll help debug that. What have you tried so far?"),
    ]
```

Images

FastMCP provides an Image class that automatically handles image data:

```python
from mcp.server.fastmcp import FastMCP, Image
from PIL import Image as PILImage

mcp = FastMCP("My App")


@mcp.tool()
def create_thumbnail(image_path: str) -> Image:
    """Create a thumbnail from an image"""
    img = PILImage.open(image_path)
    img.thumbnail((100, 100))
    return Image(data=img.tobytes(), format="png")
```

Context

The Context object gives your tools and resources access to MCP capabilities:

```python
from mcp.server.fastmcp import FastMCP, Context

mcp = FastMCP("My App")


@mcp.tool()
async def long_task(files: list[str], ctx: Context) -> str:
    """Process multiple files with progress tracking"""
    for i, file in enumerate(files):
        ctx.info(f"Processing {file}")
        await ctx.report_progress(i, len(files))
        data, mime_type = await ctx.read_resource(f"file://{file}")
    return "Processing complete"
```

Running Your Server

Development Mode

The fastest way to test and debug your server is with the MCP Inspector:

```bash
mcp dev server.py

# Add dependencies
mcp dev server.py --with pandas --with numpy

# Mount local code
mcp dev server.py --with-editable .
```

Claude Desktop Integration

Once your server is ready, install it in Claude Desktop:

```bash
mcp install server.py

# Custom name
mcp install server.py --name "My Analytics Server"

# Environment variables
mcp install server.py -v API_KEY=abc123 -v DB_URL=postgres://...
mcp install server.py -f .env
```

Direct Execution

For advanced scenarios like custom deployments:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")

if __name__ == "__main__":
    mcp.run()
```

Run it with:

```bash
python server.py
# or
mcp run server.py
```

Mounting to an Existing ASGI Server

You can mount the SSE server to an existing ASGI server using the `sse_app` method. This allows you to integrate the SSE server with other ASGI applications.

```python
from starlette.applications import Starlette
from starlette.routing import Mount, Host

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")

# Mount the SSE server to the existing ASGI server
app = Starlette(
    routes=[
        Mount('/', app=mcp.sse_app()),
    ]
)

# or dynamically mount as host
app.router.routes.append(Host('mcp.acme.corp', app=mcp.sse_app()))
```

For more information on mounting applications in Starlette, see the Starlette documentation.

Examples

Echo Server

A simple server demonstrating resources, tools, and prompts:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Echo")


@mcp.resource("echo://{message}")
def echo_resource(message: str) -> str:
    """Echo a message as a resource"""
    return f"Resource echo: {message}"


@mcp.tool()
def echo_tool(message: str) -> str:
    """Echo a message as a tool"""
    return f"Tool echo: {message}"


@mcp.prompt()
def echo_prompt(message: str) -> str:
    """Create an echo prompt"""
    return f"Please process this message: {message}"
```

SQLite Explorer

A more complex example showing database integration:

```python
import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("SQLite Explorer")


@mcp.resource("schema://main")
def get_schema() -> str:
    """Provide the database schema as a resource"""
    conn = sqlite3.connect("database.db")
    schema = conn.execute("SELECT sql FROM sqlite_master WHERE type='table'").fetchall()
    return "\n".join(sql[0] for sql in schema if sql[0])


@mcp.tool()
def query_data(sql: str) -> str:
    """Execute SQL queries safely"""
    conn = sqlite3.connect("database.db")
    try:
        result = conn.execute(sql).fetchall()
        return "\n".join(str(row) for row in result)
    except Exception as e:
        return f"Error: {str(e)}"
```

Advanced Usage

Low-Level Server

For more control, you can use the low-level server implementation directly. This gives you full access to the protocol and allows you to customize every aspect of your server, including lifecycle management through the lifespan API:

```python
from contextlib import asynccontextmanager
from collections.abc import AsyncIterator

from fake_database import Database  # Replace with your actual DB type

from mcp.server import Server


@asynccontextmanager
async def server_lifespan(server: Server) -> AsyncIterator[dict]:
    """Manage server startup and shutdown lifecycle."""
    # Initialize resources on startup
    db = await Database.connect()
    try:
        yield {"db": db}
    finally:
        # Clean up on shutdown
        await db.disconnect()


# Pass lifespan to server
server = Server("example-server", lifespan=server_lifespan)


# Access lifespan context in handlers
@server.call_tool()
async def query_db(name: str, arguments: dict) -> list:
    ctx = server.request_context
    db = ctx.lifespan_context["db"]
    return await db.query(arguments["query"])
```

The lifespan API provides:

  • A way to initialize resources when the server starts and clean them up when it stops
  • Access to initialized resources through the request context in handlers
  • Type-safe context passing between lifespan and request handlers

```python
import mcp.server.stdio
import mcp.types as types
from mcp.server.lowlevel import NotificationOptions, Server
from mcp.server.models import InitializationOptions

# Create a server instance
server = Server("example-server")


@server.list_prompts()
async def handle_list_prompts() -> list[types.Prompt]:
    return [
        types.Prompt(
            name="example-prompt",
            description="An example prompt template",
            arguments=[
                types.PromptArgument(
                    name="arg1", description="Example argument", required=True
                )
            ],
        )
    ]


@server.get_prompt()
async def handle_get_prompt(
    name: str, arguments: dict[str, str] | None
) -> types.GetPromptResult:
    if name != "example-prompt":
        raise ValueError(f"Unknown prompt: {name}")

    return types.GetPromptResult(
        description="Example prompt",
        messages=[
            types.PromptMessage(
                role="user",
                content=types.TextContent(type="text", text="Example prompt text"),
            )
        ],
    )


async def run():
    async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
        await server.run(
            read_stream,
            write_stream,
            InitializationOptions(
                server_name="example",
                server_version="0.1.0",
                capabilities=server.get_capabilities(
                    notification_options=NotificationOptions(),
                    experimental_capabilities={},
                ),
            ),
        )


if __name__ == "__main__":
    import asyncio

    asyncio.run(run())
```

Writing MCP Clients

The SDK provides a high-level client interface for connecting to MCP servers:

```python
from mcp import ClientSession, StdioServerParameters, types
from mcp.client.stdio import stdio_client

# Create server parameters for stdio connection
server_params = StdioServerParameters(
    command="python",  # Executable
    args=["example_server.py"],  # Optional command line arguments
    env=None,  # Optional environment variables
)


# Optional: create a sampling callback
async def handle_sampling_message(
    message: types.CreateMessageRequestParams,
) -> types.CreateMessageResult:
    return types.CreateMessageResult(
        role="assistant",
        content=types.TextContent(
            type="text",
            text="Hello, world! from model",
        ),
        model="gpt-3.5-turbo",
        stopReason="endTurn",
    )


async def run():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(
            read, write, sampling_callback=handle_sampling_message
        ) as session:
            # Initialize the connection
            await session.initialize()

            # List available prompts
            prompts = await session.list_prompts()

            # Get a prompt
            prompt = await session.get_prompt(
                "example-prompt", arguments={"arg1": "value"}
            )

            # List available resources
            resources = await session.list_resources()

            # List available tools
            tools = await session.list_tools()

            # Read a resource
            content, mime_type = await session.read_resource("file://some/path")

            # Call a tool
            result = await session.call_tool("tool-name", arguments={"arg1": "value"})


if __name__ == "__main__":
    import asyncio

    asyncio.run(run())
```

MCP Primitives

The MCP protocol defines three core primitives that servers can implement:

| Primitive | Control | Description | Example Use |
|-----------|---------|-------------|-------------|
| Prompts | User-controlled | Interactive templates invoked by user choice | Slash commands, menu options |
| Resources | Application-controlled | Contextual data managed by the client application | File contents, API responses |
| Tools | Model-controlled | Functions exposed to the LLM to take actions | API calls, data updates |

Server Capabilities

MCP servers declare capabilities during initialization:

| Capability | Feature Flag | Description |
|------------|--------------|-------------|
| `prompts` | `listChanged` | Prompt template management |
| `resources` | `subscribe`, `listChanged` | Resource exposure and updates |
| `tools` | `listChanged` | Tool discovery and execution |
| `logging` | - | Server logging configuration |
| `completion` | - | Argument completion suggestions |

Documentation

Contributing

We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the contributing guide to get started.

License

This project is licensed under the MIT License - see the LICENSE file for details.

