
Bridging AI and real-time data with tools
Have you ever wondered why you can chat with an LLM about how to solve math problems, but they struggle with actual mathematical calculations? Or why they can't reliably provide up-to-date information? LLMs are trained on text, and they learn to predict patterns in that text. They don't have a built-in calculator, so they can only approximate based on patterns they've seen, leading to errors, especially with large numbers or complex calculations. The text they're trained on is also a snapshot in time, so asking about the current weather or today's stock prices would only be as current as the training data.
Many times, though, we'll need to augment LLM responses with real-time data from external tools and APIs. One way we can do this is with an LLM feature called tool use (or function calling), which gives the model its own set of tools to work with. Tool use can be applied to fetch real-time data from a database or inventory system, stock prices, weather forecasts, or even your cloud resources.
In this blog post, I'm going to show you how to integrate fetching real-time data from a weather API so you can interact with real-time data using natural language. We'll implement this in Python, and using Amazon Bedrock's Converse API, we'll define a tool for our AI to use when requested. This creates a seamless interaction between the AI and a custom function, allowing us to combine the strengths of both.
Let's get started!
Prerequisites
To work through this example, you'll need a few bits set up first:
- An AWS account. You can create your account here.
- Request access to an AI model (we'll use Claude Sonnet) on Amazon Bedrock before you can use it. Learn about model access here.
- Python 3.6.0 or later set up and configured on your system.
- A Python virtual environment set up with packages installed via requirements.txt. Read more about doing this here.
Just want the code? Grab it here.
1. Setup
First, we'll set up the system prompt and the configuration for our tool.
The system prompt tells the model that it should use a tool in a specific situation -- when providing current weather data, use the Weather_Tool.
```python
## Step 1: Define system prompt that requires the use of the tool
#
SYSTEM_PROMPT = """
You are a weather assistant that provides current weather data for user-specified locations using only
the Weather_Tool, which expects latitude and longitude. Infer the coordinates from the location yourself.
If the user provides coordinates, infer the approximate location and refer to it in your response.
To use the tool, you strictly apply the provided tool specification.

- Explain your step-by-step process, and give brief updates before each step.
- Only use the Weather_Tool for data. Never guess or make up information.
- Repeat the tool use for subsequent requests if necessary.
- If the tool errors, apologize, explain weather is unavailable, and suggest other options.
- Report temperatures in °C (°F) and wind in km/h (mph). Keep weather reports concise. Sparingly use
  emojis where appropriate.
- Only respond to weather queries. Remind off-topic users of your purpose.
- Never claim to search online, access external data, or use tools besides Weather_Tool.
- Complete the entire process until you have all required data before sending the complete response.
"""
```
Then we define the specification for our weather tool using the Converse API tool definition format. We describe the weather tool and define its inputs so that the model knows how to interact with it. In this case, the tool expects the latitude and longitude of a location.
```python
## Step 2: Define weather tool using the Converse API tool definition format
#
WEATHER_TOOL_SPEC = {
    "toolSpec": {
        "name": "Weather_Tool",
        "description": "Get the current weather for a given location, based on its WGS84 coordinates.",
        "inputSchema": {
            "json": {
                "type": "object",
                "properties": {
                    "latitude": {
                        "type": "string",
                        "description": "Geographical WGS84 latitude of the location.",
                    },
                    "longitude": {
                        "type": "string",
                        "description": "Geographical WGS84 longitude of the location.",
                    },
                },
                "required": ["latitude", "longitude"],
            }
        },
    }
}
```
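Given this schema, when the model decides to call the tool it produces an input dictionary matching the `inputSchema` above. The coordinate values here are illustrative, but the shape is what our tool function will receive:

```python
# Example of the input the model produces for Weather_Tool, matching
# the inputSchema above (coordinate values are illustrative).
tool_input = {"latitude": "44.9778", "longitude": "-93.2650"}

# The schema declares both fields required and typed as strings, so a
# quick sanity check before calling the tool might look like this:
required = ["latitude", "longitude"]
missing = [k for k in required if k not in tool_input]
assert not missing, f"missing fields: {missing}"
```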
2. Implement the tool
Next, we'll implement the tool. This is where we'll make an API call to the Open-Meteo API to retrieve the current weather for a given location.
```python
## Step 3: Implement the weather tool function
#
import requests
from requests.exceptions import RequestException


def fetch_weather_data(input_data):
    """
    Fetches weather data for the given latitude and longitude using the Open-Meteo API.
    Returns the weather data or an error message if the request fails.

    :param input_data: The input data containing the latitude and longitude.
    :return: The weather data or an error message.
    """
    endpoint = "https://api.open-meteo.com/v1/forecast"
    latitude = input_data.get("latitude")
    longitude = input_data.get("longitude", "")
    params = {"latitude": latitude, "longitude": longitude, "current_weather": True}

    try:
        response = requests.get(endpoint, params=params)
        # Raise before parsing so HTTP errors are caught here
        response.raise_for_status()
        return {"weather_data": response.json()}
    except RequestException as e:
        return {"error": type(e).__name__, "message": str(e)}
    except Exception as e:
        return {"error": type(e).__name__, "message": str(e)}
```
3. Send conversation to Amazon Bedrock
To send the conversation to Amazon Bedrock, we first create a conversation array with our initial user prompt. In the example code repo, you can see how you might collect this directly from the user interactively.
```python
## Step 4: Define the initial message and conversation
#
conversation = []
initial_message = {
    "role": "user",
    "content": [{"text": "What is the current weather in Minneapolis, MN?"}],
}
conversation.append(initial_message)
```
Then, we make the `converse` call, sending along the model, conversation, tool config, and our system prompt.
```python
## Step 5: Send the message with the tool config and system prompt
#
response = bedrock.converse(
    modelId=MODEL_ID,
    messages=conversation,
    inferenceConfig={"maxTokens": 2000, "temperature": 0},
    toolConfig={"tools": [WEATHER_TOOL_SPEC]},
    system=[{"text": SYSTEM_PROMPT}],
)
```
4. Process the response and call the tool
Once we have a response from Amazon Bedrock, we have to process it. We'll check whether a tool is being requested (a `toolUse` block is returned) and, if so, call the requested tool.
With a result from the tool (the `fetch_weather_data` function), we then prepare a `toolResult` to send back to Amazon Bedrock for further processing.
```python
## Step 6: Process the response and call the tool when requested
#
response_message = response['output']['message']
conversation.append(response_message)
response_content_blocks = response_message['content']
follow_up_content_blocks = []

for content_block in response_content_blocks:
    if 'text' in content_block:
        print(content_block['text'])
    elif 'toolUse' in content_block:
        tool_use_block = content_block['toolUse']
        tool_use_name = tool_use_block['name']

        if tool_use_name == 'Weather_Tool':
            tool_result_value = fetch_weather_data(tool_use_block['input'])
            follow_up_content_blocks.append({
                "toolResult": {
                    "toolUseId": tool_use_block['toolUseId'],
                    "content": [{"json": {"result": tool_result_value}}],
                }
            })
        # Handle unknown tools
        else:
            follow_up_content_blocks.append({
                "toolResult": {
                    "toolUseId": tool_use_block['toolUseId'],
                    "content": [{"text": "unknown tool " + tool_use_name}],
                    "status": "error",
                }
            })
```
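To see the dispatch logic in isolation, here's a self-contained sketch run against a mocked response message. The structure mirrors the Converse API's output shape, but the `toolUseId`, input values, and the fake weather handler are all made up for illustration:

```python
# A sketch of the Step 6 dispatch logic, run against a mocked Converse
# response message. The toolUseId, coordinates, and handler result are
# illustrative, not real API output.
def dispatch(response_message, tools):
    follow_up = []
    for block in response_message["content"]:
        if "toolUse" in block:
            tool_use = block["toolUse"]
            handler = tools.get(tool_use["name"])
            if handler:
                result = handler(tool_use["input"])
                follow_up.append({"toolResult": {
                    "toolUseId": tool_use["toolUseId"],
                    "content": [{"json": {"result": result}}]}})
            else:
                follow_up.append({"toolResult": {
                    "toolUseId": tool_use["toolUseId"],
                    "content": [{"text": "unknown tool " + tool_use["name"]}],
                    "status": "error"}})
    return follow_up

# Mocked assistant message requesting a Weather_Tool call
mock_message = {"role": "assistant", "content": [
    {"text": "Let me check the weather."},
    {"toolUse": {"toolUseId": "tool-123", "name": "Weather_Tool",
                 "input": {"latitude": "44.9778", "longitude": "-93.2650"}}}]}

# A stand-in for fetch_weather_data that returns a canned result
fake_weather = lambda inp: {"temperature_c": 17.9}
blocks = dispatch(mock_message, {"Weather_Tool": fake_weather})
```

Running the dispatch against the mock produces a single `toolResult` block carrying the canned weather data, ready to append to a follow-up user message.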
5. Send the tool result back to Amazon Bedrock
With the `toolResult` containing the current weather data, we now send it to Amazon Bedrock so we can get the final natural language response from the model incorporating the tool result -- the current weather.
```python
## Step 7: Send tool result back to Amazon Bedrock
#
if len(follow_up_content_blocks) > 0:
    follow_up_message = {
        "role": "user",
        "content": follow_up_content_blocks,
    }
    conversation.append(follow_up_message)

    response = bedrock.converse(
        modelId=MODEL_ID,
        messages=conversation,
        inferenceConfig={"maxTokens": 2000, "temperature": 0},
        toolConfig={"tools": [WEATHER_TOOL_SPEC]},
        system=[{"text": SYSTEM_PROMPT}],
    )

    response_message = response['output']['message']
    print(response_message['content'][0]['text'])
    conversation.append(response_message)
```
Run the code
Finally, we're ready to run the code and find the current weather for a location through natural language conversations. Run the following at the command line or in your IDE:
```shell
python weather_tool_use_demo.py
```
The output we get is:
```
Okay, let me check the current weather conditions for Minneapolis, MN using the Weather Tool.

First, I'll need to get the latitude and longitude coordinates for Minneapolis:

Minneapolis, MN coordinates: 44.9778°N, 93.2650°W

Next, I'll invoke the Weather Tool with those coordinates:

Here is the current weather report for Minneapolis, MN:

The temperature is 17.9°C (64.2°F) with a partly cloudy sky. Winds are blowing from the
west-northwest at 19.8 km/h (12.3 mph). ☁️ 🍃

Let me know if you need any other weather details for Minneapolis!
```
Wrapping up
In this post, you learned why LLMs struggle with complex calculations and real-time information. I showed one approach for augmenting your LLM responses with data from external tools and APIs -- tool use (or function calling). Then, I showed how to implement this in Python with Amazon Bedrock's Converse API.
Other helpful resources
- The code repo for this article
- Exploring tool use with Amazon Bedrock (Instagram)
- Intro to Tool Use with the Amazon Bedrock Converse API
- Other Python examples for Amazon Bedrock
I hope this has been helpful. If you'd like more like this, smash that like button 👍, share this with your friends 👯, or drop a comment below 💬.