Python SDK for interacting with the MCP Toolbox for Databases.

googleapis/mcp-toolbox-sdk-python

 
 

This SDK allows you to seamlessly integrate the functionalities of Toolbox into your LangChain LLM applications, enabling advanced orchestration and interaction with GenAI models.

Table of Contents

  • Installation
  • Quickstart
  • Usage
  • Loading Tools
  • Use with LangChain
  • Use with LangGraph
  • Manual usage
  • Authenticating Tools
  • Binding Parameter Values
  • Asynchronous Usage

Installation

pip install toolbox-langchain

Quickstart

Here's a minimal example to get you started using LangGraph:

from toolbox_langchain import ToolboxClient
from langchain_google_vertexai import ChatVertexAI
from langgraph.prebuilt import create_react_agent

toolbox = ToolboxClient("http://127.0.0.1:5000")
tools = toolbox.load_toolset()

model = ChatVertexAI(model="gemini-1.5-pro-002")
agent = create_react_agent(model, tools)

prompt = "How's the weather today?"

for s in agent.stream({"messages": [("user", prompt)]}, stream_mode="values"):
    message = s["messages"][-1]
    if isinstance(message, tuple):
        print(message)
    else:
        message.pretty_print()

Usage

Import and initialize the toolbox client.

from toolbox_langchain import ToolboxClient

# Replace with your Toolbox service's URL
toolbox = ToolboxClient("http://127.0.0.1:5000")

Loading Tools

Load a toolset

A toolset is a collection of related tools. You can load all tools in a toolset or a specific one:

# Load all tools
tools = toolbox.load_toolset()

# Load a specific toolset
tools = toolbox.load_toolset("my-toolset")

Load a single tool

tool = toolbox.load_tool("my-tool")

Loading individual tools gives you finer-grained control over which tools are available to your LLM agent.
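For example, you could load only the tools a particular agent is allowed to call and collect them into a list (a minimal sketch; the tool names below are hypothetical placeholders):

# Hypothetical tool names -- replace these with tools defined in your
# Toolbox configuration.
allowed_tools = [
    toolbox.load_tool(name)
    for name in ("lookup-order", "cancel-order")
]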

Use with LangChain

LangChain's agents can dynamically choose and execute tools based on the user input. Include tools loaded from the Toolbox SDK in the agent's toolkit:

from langchain_google_vertexai import ChatVertexAI

model = ChatVertexAI(model="gemini-1.5-pro-002")

# Initialize agent with tools
agent = model.bind_tools(tools)

# Run the agent
result = agent.invoke("Do something with the tools")

Use with LangGraph

Integrate the Toolbox SDK with LangGraph to use Toolbox service tools within a graph-based workflow. Follow the official guide with minimal changes.

Represent Tools as Nodes

Represent each tool as a LangGraph node, encapsulating the tool's execution within the node's functionality:

from toolbox_langchain import ToolboxClient
from langchain_google_vertexai import ChatVertexAI
from langgraph.graph import StateGraph, MessagesState
from langgraph.prebuilt import ToolNode

# Define the function that calls the model
def call_model(state: MessagesState):
    messages = state['messages']
    response = model.invoke(messages)
    # Return a list to add to existing messages
    return {"messages": [response]}

model = ChatVertexAI(model="gemini-1.5-pro-002")

builder = StateGraph(MessagesState)
tool_node = ToolNode(tools)

builder.add_node("agent", call_model)
builder.add_node("tools", tool_node)

Connect Tools with LLM

Connect tool nodes with LLM nodes. The LLM decides which tool to use based on input or context. Tool output can be fed back into the LLM:

from typing import Literal
from langgraph.graph import END, START
from langchain_core.messages import HumanMessage

# Define the function that determines whether to continue or not
def should_continue(state: MessagesState) -> Literal["tools", END]:
    messages = state['messages']
    last_message = messages[-1]
    if last_message.tool_calls:
        return "tools"  # Route to "tools" node if LLM makes a tool call
    return END  # Otherwise, stop

builder.add_edge(START, "agent")
builder.add_conditional_edges("agent", should_continue)
builder.add_edge("tools", "agent")

graph = builder.compile()

graph.invoke({"messages": [HumanMessage(content="Do something with the tools")]})

Manual usage

Execute a tool manually using the invoke method:

result = tools[0].invoke({"name": "Alice", "age": 30})

This is useful for testing tools or when you need precise control over tool execution outside of an agent framework.
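For instance, manual invocation makes it easy to sanity-check a tool in a unit test before wiring it into an agent (a minimal sketch, assuming a hypothetical "my-tool" that accepts "name" and "age" parameters and returns a non-empty result):

from toolbox_langchain import ToolboxClient

def test_my_tool():
    # Load the tool directly and invoke it with fixed inputs.
    toolbox = ToolboxClient("http://127.0.0.1:5000")
    tool = toolbox.load_tool("my-tool")
    result = tool.invoke({"name": "Alice", "age": 30})
    assert result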

Authenticating Tools

Warning

Always use HTTPS to connect your application with the Toolbox service, especially when using tools with authentication configured. Using HTTP exposes your application to serious security risks.

Some tools require user authentication to access sensitive data.

Supported Authentication Mechanisms

Toolbox currently supports authentication using the OIDC protocol with ID tokens (not access tokens) for Google OAuth 2.0.

Configure Tools

Refer to these instructions on configuring tools for authenticated parameters.

Configure SDK

You need a method to retrieve an ID token from your authentication service:

async def get_auth_token():
    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
    # This example just returns a placeholder. Replace with your actual token retrieval.
    return "YOUR_ID_TOKEN"  # Placeholder
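If your tools expect Google-signed ID tokens, one possible implementation (a sketch, not the only option) uses the google-auth library with Application Default Credentials; the audience value below is a placeholder for the client ID your tool's auth source expects:

import google.auth.transport.requests
import google.oauth2.id_token

async def get_auth_token():
    # Fetch a Google-signed ID token using Application Default Credentials.
    # "YOUR_CLIENT_ID" is a placeholder audience; replace it with the client ID
    # configured for your tool's auth source.
    # Note: fetch_id_token is synchronous; offload it to a thread if blocking
    # your event loop is a concern.
    request = google.auth.transport.requests.Request()
    return google.oauth2.id_token.fetch_id_token(request, "YOUR_CLIENT_ID")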

Add Authentication to a Tool

toolbox = ToolboxClient("http://127.0.0.1:5000")
tools = toolbox.load_toolset()

auth_tool = tools[0].add_auth_token("my_auth", get_auth_token)  # Single token

multi_auth_tool = tools[0].add_auth_tokens({"my_auth": get_auth_token})  # Multiple tokens

# OR

auth_tools = [tool.add_auth_token("my_auth", get_auth_token) for tool in tools]

Add Authentication While Loading

auth_tool = toolbox.load_tool("my-tool", auth_tokens={"my_auth": get_auth_token})
auth_tools = toolbox.load_toolset(auth_tokens={"my_auth": get_auth_token})

Note

Adding auth tokens during loading only affects the tools loaded within that call.

Complete Example

import asyncio
from toolbox_langchain import ToolboxClient

async def get_auth_token():
    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
    # This example just returns a placeholder. Replace with your actual token retrieval.
    return "YOUR_ID_TOKEN"  # Placeholder

toolbox = ToolboxClient("http://127.0.0.1:5000")
tool = toolbox.load_tool("my-tool")

auth_tool = tool.add_auth_token("my_auth", get_auth_token)
result = auth_tool.invoke({"input": "some input"})

print(result)

Binding Parameter Values

Predetermine values for tool parameters using the SDK. These values won't be modified by the LLM. This is useful for:

  • Protecting sensitive information: API keys, secrets, etc.
  • Enforcing consistency: Ensuring specific values for certain parameters.
  • Pre-filling known data: Providing defaults or context.

Binding Parameters to a Tool

toolbox = ToolboxClient("http://127.0.0.1:5000")
tools = toolbox.load_toolset()

bound_tool = tools[0].bind_param("param", "value")  # Single param

multi_bound_tool = tools[0].bind_params({"param1": "value1", "param2": "value2"})  # Multiple params

# OR

bound_tools = [tool.bind_param("param", "value") for tool in tools]
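As a concrete illustration of protecting sensitive information, an API key can be read from the environment and bound so the LLM never sees or controls it (a sketch; "api_key" and the environment variable name are hypothetical placeholders):

import os

# Hypothetical parameter name -- replace with the sensitive parameter your tool defines.
api_key_tool = tools[0].bind_param("api_key", os.environ["MY_SERVICE_API_KEY"])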

Binding Parameters While Loading

bound_tool = toolbox.load_tool("my-tool", bound_params={"param": "value"})
bound_tools = toolbox.load_toolset(bound_params={"param": "value"})

Note

Values bound during loading only affect the tools loaded in that call.

Binding Dynamic Values

Use a function to bind dynamic values:

def get_dynamic_value():
    # Logic to determine the value
    return "dynamic_value"

dynamic_bound_tool = tool.bind_param("param", get_dynamic_value)

Important

You don't need to modify tool configurations to bind parameter values.

Asynchronous Usage

For better performance through cooperative multitasking, you can use the asynchronous interfaces of the ToolboxClient.

Note

Asynchronous interfaces like aload_tool and aload_toolset require an asynchronous environment. For guidance on running asynchronous Python programs, see the asyncio documentation.

import asyncio
from toolbox_langchain import ToolboxClient

async def main():
    toolbox = ToolboxClient("http://127.0.0.1:5000")
    tool = await toolbox.aload_tool("my-tool")
    tools = await toolbox.aload_toolset()
    response = await tool.ainvoke({"input": "some input"})

if __name__ == "__main__":
    asyncio.run(main())
