Robocorp Toolkit
This notebook covers how to get started with the Robocorp Action Server action toolkit and LangChain.
Robocorp is the easiest way to extend the capabilities of AI agents, assistants and copilots with custom actions.
Installation
First, see the Robocorp Quickstart on how to set up the Action Server and create your Actions.
In your LangChain application, install the langchain-robocorp package:
# Install package
%pip install --upgrade --quiet langchain-robocorp
When you create a new Action Server following the quickstart above, it creates a directory of files, including action.py.
We can add Python functions as actions as shown here. Let's add a dummy function to action.py.
@action
def get_weather_forecast(city: str, days: int, scale: str = "celsius") -> str:
    """
    Returns weather conditions forecast for a given city.

    Args:
        city (str): Target city to get the weather conditions for
        days (int): Number of days of forecast to return
        scale (str): Temperature scale to use, should be one of "celsius" or "fahrenheit"

    Returns:
        str: The requested weather conditions forecast
    """
    return "75F and sunny :)"
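Outside the Action Server, the function body can be exercised as plain Python. A minimal sketch (without the @action decorator, which comes from the Robocorp actions package, and slightly extended so the scale argument affects the dummy output):

```python
# Plain-Python sketch of the action above, runnable without the
# Action Server. The scale handling here is illustrative only.
def get_weather_forecast(city: str, days: int, scale: str = "celsius") -> str:
    """Return a dummy weather forecast for a given city."""
    unit = "F" if scale == "fahrenheit" else "C"
    return f"75{unit} and sunny :)"

print(get_weather_forecast("San Francisco", 1, scale="fahrenheit"))
# prints: 75F and sunny :)
```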
We then start the server:
action-server start
And we can see:
Found new action: get_weather_forecast
Test locally by going to the server running at http://localhost:8080 and using the UI to run the function.
Environment Setup
Optionally you can set the following environment variables:
LANGSMITH_TRACING=true: Enables LangSmith run tracing, which can also be bound to the respective Action Server action run logs. See the LangSmith documentation for more.
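In a notebook, the variable can also be set from Python before running the agent (a minimal sketch; any API keys LangSmith needs are covered in its own documentation):

```python
import os

# Enable LangSmith tracing for subsequent runs (optional).
os.environ["LANGSMITH_TRACING"] = "true"

print(os.environ["LANGSMITH_TRACING"])
# prints: true
```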
Usage
We started the local Action Server above, running on http://localhost:8080.
from langchain.agents import AgentExecutor, OpenAIFunctionsAgent
from langchain_core.messages import SystemMessage
from langchain_openai import ChatOpenAI
from langchain_robocorp import ActionServerToolkit

# Initialize LLM chat model
llm = ChatOpenAI(model="gpt-4", temperature=0)

# Initialize Action Server Toolkit
toolkit = ActionServerToolkit(url="http://localhost:8080", report_trace=True)
tools = toolkit.get_tools()

# Initialize Agent
system_message = SystemMessage(content="You are a helpful assistant")
prompt = OpenAIFunctionsAgent.create_prompt(system_message)
agent = OpenAIFunctionsAgent(llm=llm, prompt=prompt, tools=tools)

executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

executor.invoke("What is the current weather today in San Francisco in fahrenheit?")
> Entering new AgentExecutor chain...

Invoking: `robocorp_action_server_get_weather_forecast` with `{'city': 'San Francisco', 'days': 1, 'scale': 'fahrenheit'}`
"75F and sunny :)"
The current weather today in San Francisco is 75F and sunny.

> Finished chain.
{'input': 'What is the current weather today in San Francisco in fahrenheit?',
'output': 'The current weather today in San Francisco is 75F and sunny.'}
Single input tools
By default, toolkit.get_tools() will return the actions as Structured Tools.
To return single-input tools, pass a chat model to be used for processing the inputs.
# Initialize single input Action Server Toolkit
toolkit = ActionServerToolkit(url="http://localhost:8080")
tools = toolkit.get_tools(llm=llm)
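Conceptually, the single-input variant wraps each structured action so that one free-text string is first converted (by the chat model) into the action's structured arguments. A toy sketch of that wrapping, using a hypothetical `to_structured_args` stand-in for the chat-model step, not anything from langchain-robocorp:

```python
import json

def to_structured_args(text: str) -> dict:
    """Hypothetical stand-in for the chat-model step that turns one
    input string into structured arguments; here it just parses JSON."""
    return json.loads(text)

def get_weather_forecast(city: str, days: int, scale: str = "celsius") -> str:
    # Dummy action body, mirroring the example above.
    return "75F and sunny :)"

def single_input_tool(text: str) -> str:
    """Single-input wrapper: one string in, structured call underneath."""
    return get_weather_forecast(**to_structured_args(text))

print(single_input_tool('{"city": "San Francisco", "days": 1, "scale": "fahrenheit"}'))
# prints: 75F and sunny :)
```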
Related
- Tool conceptual guide
- Tool how-to guides