How to add a human-in-the-loop for tools
There are certain tools that we don't trust a model to execute on its own. One thing we can do in such situations is require human approval before the tool is invoked.
This how-to guide shows a simple way to add a human-in-the-loop for code running in a Jupyter notebook or in a terminal.
To build a production application, you will need to do more work to keep track of application state appropriately.
We recommend using LangGraph for powering such a capability. For more details, please see this guide.
Setup
We'll need to install the following packages:
```python
%pip install --upgrade --quiet langchain
```
And set these environment variables:
```python
import getpass
import os

# If you'd like to use LangSmith, uncomment the below:
# os.environ["LANGSMITH_TRACING"] = "true"
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass()
```
Chain
Let's create a few simple (dummy) tools and a tool-calling chain:
```shell
pip install -qU "langchain[google-genai]"
```

```python
import getpass
import os

if not os.environ.get("GOOGLE_API_KEY"):
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter API key for Google Gemini: ")

from langchain.chat_models import init_chat_model

llm = init_chat_model("gemini-2.0-flash", model_provider="google_genai")
```
```python
from typing import Dict, List

from langchain_core.messages import AIMessage
from langchain_core.tools import tool


@tool
def count_emails(last_n_days: int) -> int:
    """Dummy function to count number of e-mails. Returns 2 * last_n_days."""
    return last_n_days * 2


@tool
def send_email(message: str, recipient: str) -> str:
    """Dummy function for sending an e-mail."""
    return f"Successfully sent email to {recipient}."


tools = [count_emails, send_email]
llm_with_tools = llm.bind_tools(tools)


def call_tools(msg: AIMessage) -> List[Dict]:
    """Simple sequential tool calling helper."""
    tool_map = {tool.name: tool for tool in tools}
    tool_calls = msg.tool_calls.copy()
    for tool_call in tool_calls:
        tool_call["output"] = tool_map[tool_call["name"]].invoke(tool_call["args"])
    return tool_calls


chain = llm_with_tools | call_tools
chain.invoke("how many emails did i get in the last 5 days?")
```
```output
[{'name': 'count_emails',
  'args': {'last_n_days': 5},
  'id': 'toolu_01QYZdJ4yPiqsdeENWHqioFW',
  'output': 10}]
```
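To see what `call_tools` is doing without a live model, here's a model-free sketch: each tool call is a plain dict with `"name"`, `"args"`, and `"id"` keys, and the helper simply looks each one up in `tool_map` and attaches the result under an `"output"` key. (The names below are illustrative; this uses a plain function call instead of the tool's `.invoke`.)

```python
# Model-free sketch of the call_tools loop (illustrative names only).
def count_emails(last_n_days: int) -> int:
    """Dummy tool: returns 2 * last_n_days."""
    return last_n_days * 2


tool_map = {"count_emails": count_emails}
tool_calls = [{"name": "count_emails", "args": {"last_n_days": 5}, "id": "call_1"}]

# Attach each tool's result under an "output" key, as call_tools does.
for tool_call in tool_calls:
    tool_call["output"] = tool_map[tool_call["name"]](**tool_call["args"])

print(tool_calls[0]["output"])  # 10
```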
Adding human approval
Let's add a step to the chain that asks a person to approve or reject the requested tool calls.
On rejection, the step raises an exception, which stops execution of the rest of the chain.
```python
import json


class NotApproved(Exception):
    """Custom exception."""


def human_approval(msg: AIMessage) -> AIMessage:
    """Responsible for passing through its input or raising an exception.

    Args:
        msg: output from the chat model

    Returns:
        msg: the original message, unchanged
    """
    tool_strs = "\n\n".join(
        json.dumps(tool_call, indent=2) for tool_call in msg.tool_calls
    )
    input_msg = (
        f"Do you approve of the following tool invocations\n\n{tool_strs}\n\n"
        "Anything except 'Y'/'Yes' (case-insensitive) will be treated as a no.\n >>>"
    )
    resp = input(input_msg)
    if resp.lower() not in ("yes", "y"):
        raise NotApproved(f"Tool invocations not approved:\n\n{tool_strs}")
    return msg
```
```python
chain = llm_with_tools | human_approval | call_tools
chain.invoke("how many emails did i get in the last 5 days?")
```
```output
Do you approve of the following tool invocations

{
  "name": "count_emails",
  "args": {
    "last_n_days": 5
  },
  "id": "toolu_01WbD8XeMoQaRFtsZezfsHor"
}

Anything except 'Y'/'Yes' (case-insensitive) will be treated as a no.
 >>> yes
```

```output
[{'name': 'count_emails',
  'args': {'last_n_days': 5},
  'id': 'toolu_01WbD8XeMoQaRFtsZezfsHor',
  'output': 10}]
```
```python
try:
    chain.invoke("Send sally@gmail.com an email saying 'What's up homie'")
except NotApproved as e:
    print()
    print(e)
```
```output
Do you approve of the following tool invocations

{
  "name": "send_email",
  "args": {
    "recipient": "sally@gmail.com",
    "message": "What's up homie"
  },
  "id": "toolu_014XccHFzBiVcc9GV1harV9U"
}

Anything except 'Y'/'Yes' (case-insensitive) will be treated as a no.
 >>> no
```

```output
Tool invocations not approved:

{
  "name": "send_email",
  "args": {
    "recipient": "sally@gmail.com",
    "message": "What's up homie"
  },
  "id": "toolu_014XccHFzBiVcc9GV1harV9U"
}
```
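In a real application you may not want to gate every tool: a read-only tool like `count_emails` could be auto-approved, while anything with side effects still requires sign-off. Here's a minimal, model-free sketch of such a policy; the `SENSITIVE_TOOLS` set and `needs_approval` helper are illustrative, not part of LangChain.

```python
# Hypothetical policy: only side-effecting tools need human sign-off.
SENSITIVE_TOOLS = {"send_email"}


def needs_approval(tool_calls: list) -> list:
    """Return the subset of tool calls that a human must approve."""
    return [tc for tc in tool_calls if tc["name"] in SENSITIVE_TOOLS]


calls = [
    {"name": "count_emails", "args": {"last_n_days": 5}, "id": "1"},
    {"name": "send_email", "args": {"recipient": "sally@gmail.com", "message": "hi"}, "id": "2"},
]
print([tc["name"] for tc in needs_approval(calls)])  # ['send_email']
```

Inside `human_approval`, you would then prompt only when `needs_approval(msg.tool_calls)` is non-empty and pass the message through otherwise.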