How to convert Runnables to Tools

Prerequisites

This guide assumes familiarity with LangChain Runnables, tools, and agents.

Here we will demonstrate how to convert a LangChain Runnable into a tool that can be used by agents, chains, or chat models.

Dependencies

Note: this guide requires langchain-core >= 0.2.13. We will also use OpenAI for embeddings, but any LangChain embeddings should suffice. We will use a simple LangGraph agent for demonstration purposes.

%%capture --no-stderr
%pip install -U langchain-core langchain-openai langgraph

LangChain tools are interfaces that an agent, chain, or chat model can use to interact with the world. See here for how-to guides covering tool-calling, built-in tools, custom tools, and more information.

LangChain tools -- instances of BaseTool -- are Runnables with additional constraints that enable them to be invoked effectively by language models:

  • Their inputs are constrained to be serializable, specifically strings and Python dict objects;
  • They contain names and descriptions indicating how and when they should be used;
  • They may contain a detailed args_schema for their arguments. That is, while a tool (as a Runnable) might accept a single dict input, the specific keys and type information needed to populate a dict should be specified in the args_schema.

Runnables that accept string or dict input can be converted to tools using the as_tool method, which allows for the specification of names, descriptions, and additional schema information for arguments.

Basic usage

With typed dict input:

from typing import List

from langchain_core.runnables import RunnableLambda
from typing_extensions import TypedDict


class Args(TypedDict):
    a: int
    b: List[int]


def f(x: Args) -> str:
    return str(x["a"] * max(x["b"]))


runnable = RunnableLambda(f)
as_tool = runnable.as_tool(
    name="My tool",
    description="Explanation of when to use tool.",
)
API Reference: RunnableLambda
print(as_tool.description)

as_tool.args_schema.model_json_schema()
Explanation of when to use tool.
{'properties': {'a': {'title': 'A', 'type': 'integer'},
'b': {'items': {'type': 'integer'}, 'title': 'B', 'type': 'array'}},
'required': ['a', 'b'],
'title': 'My tool',
'type': 'object'}
as_tool.invoke({"a": 3, "b": [1, 2]})
'6'

Without typing information, arg types can be specified via arg_types:

from typing import Any, Dict


def g(x: Dict[str, Any]) -> str:
    return str(x["a"] * max(x["b"]))


runnable = RunnableLambda(g)
as_tool = runnable.as_tool(
    name="My tool",
    description="Explanation of when to use tool.",
    arg_types={"a": int, "b": List[int]},
)

Alternatively, the schema can be fully specified by directly passing the desired args_schema for the tool:

from pydantic import BaseModel, Field


class GSchema(BaseModel):
    """Apply a function to an integer and list of integers."""

    a: int = Field(..., description="Integer")
    b: List[int] = Field(..., description="List of ints")


runnable = RunnableLambda(g)
as_tool = runnable.as_tool(GSchema)

String input is also supported:

def f(x: str) -> str:
    return x + "a"


def g(x: str) -> str:
    return x + "z"


runnable = RunnableLambda(f) | g
as_tool = runnable.as_tool()
as_tool.invoke("b")
'baz'

In agents

Below we will incorporate LangChain Runnables as tools in an agent application. We will demonstrate with:

  • a document retriever;
  • a simple RAG chain, allowing an agent to delegate relevant queries to it.

We first instantiate a chat model that supports tool calling:

pip install -qU "langchain[google-genai]"
import getpass
import os

if not os.environ.get("GOOGLE_API_KEY"):
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter API key for Google Gemini: ")

from langchain.chat_models import init_chat_model

llm = init_chat_model("gemini-2.0-flash", model_provider="google_genai")

Following the RAG tutorial, let's first construct a retriever:

from langchain_core.documents import Document
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import OpenAIEmbeddings

documents = [
    Document(
        page_content="Dogs are great companions, known for their loyalty and friendliness.",
    ),
    Document(
        page_content="Cats are independent pets that often enjoy their own space.",
    ),
]

vectorstore = InMemoryVectorStore.from_documents(
    documents, embedding=OpenAIEmbeddings()
)

retriever = vectorstore.as_retriever(
    search_type="similarity",
    search_kwargs={"k": 1},
)

We next use a simple pre-built LangGraph agent and provide it the tool:

from langgraph.prebuilt import create_react_agent

tools = [
    retriever.as_tool(
        name="pet_info_retriever",
        description="Get information about pets.",
    )
]
agent = create_react_agent(llm, tools)
API Reference: create_react_agent
for chunk in agent.stream({"messages": [("human", "What are dogs known for?")]}):
    print(chunk)
    print("----")
{'agent': {'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_W8cnfOjwqEn4cFcg19LN9mYD', 'function': {'arguments': '{"__arg1":"dogs"}', 'name': 'pet_info_retriever'}, 'type': 'function'}]}, response_metadata={'token_usage': {'completion_tokens': 19, 'prompt_tokens': 60, 'total_tokens': 79}, 'model_name': 'gpt-4o-mini', 'system_fingerprint': None, 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-d7f81de9-1fb7-4caf-81ed-16dcdb0b2ab4-0', tool_calls=[{'name': 'pet_info_retriever', 'args': {'__arg1': 'dogs'}, 'id': 'call_W8cnfOjwqEn4cFcg19LN9mYD'}], usage_metadata={'input_tokens': 60, 'output_tokens': 19, 'total_tokens': 79})]}}
----
{'tools': {'messages': [ToolMessage(content="[Document(id='86f835fe-4bbe-4ec6-aeb4-489a8b541707', page_content='Dogs are great companions, known for their loyalty and friendliness.')]", name='pet_info_retriever', tool_call_id='call_W8cnfOjwqEn4cFcg19LN9mYD')]}}
----
{'agent': {'messages': [AIMessage(content='Dogs are known for being great companions, known for their loyalty and friendliness.', response_metadata={'token_usage': {'completion_tokens': 18, 'prompt_tokens': 134, 'total_tokens': 152}, 'model_name': 'gpt-4o-mini', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-9ca5847a-a5eb-44c0-a774-84cc2c5bbc5b-0', usage_metadata={'input_tokens': 134, 'output_tokens': 18, 'total_tokens': 152})]}}
----

See the LangSmith trace for the above run.

Going further, we can create a simple RAG chain that takes an additional parameter -- here, the "style" of the answer.

from operator import itemgetter

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough

system_prompt = """
You are an assistant for question-answering tasks.
Use the below context to answer the question. If
you don't know the answer, say you don't know.
Use three sentences maximum and keep the answer
concise.

Answer in the style of {answer_style}.

Question: {question}

Context: {context}
"""

prompt = ChatPromptTemplate.from_messages([("system", system_prompt)])

rag_chain = (
    {
        "context": itemgetter("question") | retriever,
        "question": itemgetter("question"),
        "answer_style": itemgetter("answer_style"),
    }
    | prompt
    | llm
    | StrOutputParser()
)

Note that the input schema for our chain contains the required arguments, so it converts to a tool without further specification:

rag_chain.input_schema.model_json_schema()
{'properties': {'question': {'title': 'Question'},
'answer_style': {'title': 'Answer Style'}},
'required': ['question', 'answer_style'],
'title': 'RunnableParallel<context,question,answer_style>Input',
'type': 'object'}
rag_tool = rag_chain.as_tool(
    name="pet_expert",
    description="Get information about pets.",
)

Below we again invoke the agent. Note that the agent populates the required parameters in its tool_calls:

agent = create_react_agent(llm, [rag_tool])

for chunk in agent.stream(
    {"messages": [("human", "What would a pirate say dogs are known for?")]}
):
    print(chunk)
    print("----")
{'agent': {'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_17iLPWvOD23zqwd1QVQ00Y63', 'function': {'arguments': '{"question":"What are dogs known for according to pirates?","answer_style":"quote"}', 'name': 'pet_expert'}, 'type': 'function'}]}, response_metadata={'token_usage': {'completion_tokens': 28, 'prompt_tokens': 59, 'total_tokens': 87}, 'model_name': 'gpt-4o-mini', 'system_fingerprint': None, 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-7fef44f3-7bba-4e63-8c51-2ad9c5e65e2e-0', tool_calls=[{'name': 'pet_expert', 'args': {'question': 'What are dogs known for according to pirates?', 'answer_style': 'quote'}, 'id': 'call_17iLPWvOD23zqwd1QVQ00Y63'}], usage_metadata={'input_tokens': 59, 'output_tokens': 28, 'total_tokens': 87})]}}
----
{'tools': {'messages': [ToolMessage(content='"Dogs are known for their loyalty and friendliness, making them great companions for pirates on long sea voyages."', name='pet_expert', tool_call_id='call_17iLPWvOD23zqwd1QVQ00Y63')]}}
----
{'agent': {'messages': [AIMessage(content='According to pirates, dogs are known for their loyalty and friendliness, making them great companions for pirates on long sea voyages.', response_metadata={'token_usage': {'completion_tokens': 27, 'prompt_tokens': 119, 'total_tokens': 146}, 'model_name': 'gpt-4o-mini', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-5a30edc3-7be0-4743-b980-ca2f8cad9b8d-0', usage_metadata={'input_tokens': 119, 'output_tokens': 27, 'total_tokens': 146})]}}
----

See the LangSmith trace for the above run.
