DEV Community

Liam Stone

Posted on
OpenAI Function Calling

Image Credit: OpenAI

Hello, fellow coders! If you've been exploring the world of AI and chatbots, you've likely heard about OpenAI's amazing language model, GPT-4, and its counterpart GPT-3.5 Turbo. They're powerful tools for transforming the way we interact with technology.

In this post, we're diving into one of their fascinating features: function calling. We'll demystify what it is, why it's useful, and how to use it, even if you're a beginner. So grab a cup of coffee, sit back, and let's get started!

What is Function Calling?

Function calling in the context of GPT-4 and GPT-3.5 Turbo is the ability for these models to understand and generate JSON objects for specific function calls based on user queries. This doesn't mean the model is executing the function, but it's providing you with the necessary information to call the function in your own code.
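To make this concrete, here's a sketch of the shape such a model response takes. The field values below are made up for illustration, not real API output; the key point is that `function_call` carries the function name plus a JSON-encoded `arguments` *string* that your own code must parse:

```python
import json

# Illustrative shape of an assistant message containing a function call
# (values are made up; the real message comes back from the API).
response_message = {
    "role": "assistant",
    "content": None,  # no text reply when the model opts to call a function
    "function_call": {
        "name": "get_current_weather",
        # arguments arrive as a JSON *string*, not a parsed object
        "arguments": '{"location": "Boston, MA", "unit": "fahrenheit"}',
    },
}

# Your code parses the arguments before calling the real function
args = json.loads(response_message["function_call"]["arguments"])
print(args["location"])  # Boston, MA
```

Notice that the model never runs anything itself; it only hands you this structured description of the call it wants made.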

Why is Function Calling Useful?

This feature opens up a world of possibilities. You can:

  • Create chatbots that answer questions by calling external APIs (like a weather API, for instance).
  • Convert natural language into API calls (imagine turning "Who are my top customers?" into an actual API call).
  • Extract structured data from a block of text.

And that's just scratching the surface!
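As an example of the third use case, structured extraction can be expressed as a function whose parameters describe the fields you want back. The `extract_person` schema below is hypothetical, just to show the idea:

```python
import json

# A hypothetical function schema for extracting structured data from text.
# The model never executes this; it only fills in arguments matching the schema.
extract_person = {
    "name": "extract_person",
    "description": "Extract a person's details from a block of text",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string", "description": "Full name"},
            "age": {"type": "integer", "description": "Age in years"},
            "city": {"type": "string", "description": "City of residence"},
        },
        "required": ["name"],
    },
}

# This would be passed to the API as functions=[extract_person]; the model
# then returns a function_call whose arguments are the extracted fields.
print(json.dumps(extract_person["parameters"]["required"]))
```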

How to Use Function Calling

Function calling involves four main steps:

  1. Call the model with the user query and a set of functions. You describe the functions you want the model to consider when analyzing the user's input.
  2. Check if the model generates a JSON object for a function call. If the model thinks a function needs to be called based on the user query, it will generate a JSON object.
  3. Parse the JSON and call your function. Take the output from the model and use it to call your function with the appropriate arguments.
  4. Call the model again with the function response. Let the model summarize the results back to the user.

Let's take a look at a Python example:

```python
import openai
import json

# A dummy function that always returns the same weather information
def get_current_weather(location, unit="fahrenheit"):
    weather_info = {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)

def run_conversation():
    messages = [{"role": "user", "content": "What's the weather like in Boston?"}]
    functions = [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ]
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
        functions=functions,
        function_call="auto",
    )
    response_message = response["choices"][0]["message"]
    if response_message.get("function_call"):
        available_functions = {"get_current_weather": get_current_weather}
        function_name = response_message["function_call"]["name"]
        function_to_call = available_functions[function_name]
        function_args = json.loads(response_message["function_call"]["arguments"])
        function_response = function_to_call(
            location=function_args.get("location"),
            unit=function_args.get("unit"),
        )
        messages.append(response_message)
        messages.append(
            {"role": "function", "name": function_name, "content": function_response}
        )
        second_response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=messages,
        )
        return second_response

print(run_conversation())
```

Example courtesy of OpenAI

This script mimics a chatbot interaction with a user asking about the weather in Boston. The run_conversation function handles the conversation, using the function calling feature of GPT-3.5 Turbo.
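Note that `run_conversation` returns the raw second response. Assuming the same dict-like, pre-v1 SDK response shape as the example above, the assistant's final text lives at `choices[0].message.content`; a small helper like this (hypothetical, for illustration with a stubbed response) pulls it out:

```python
def final_text(response):
    # Extract the assistant's natural-language summary from a
    # ChatCompletion-style response (dict-like, pre-v1 SDK shape).
    return response["choices"][0]["message"]["content"]

# Example with a stubbed response object standing in for the API result:
stub = {
    "choices": [
        {"message": {"role": "assistant", "content": "It's 72F and sunny in Boston."}}
    ]
}
print(final_text(stub))  # It's 72F and sunny in Boston.
```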

Handling Hallucinated Outputs

Sometimes, the model might generate function calls that weren't provided to it - we call these hallucinated outputs. To mitigate this, use a system message to remind the model to only use the functions it has been provided with.
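One way to apply that advice is to prepend a system message before the user query. A minimal sketch, assuming the same `messages` list structure as the example above:

```python
# Prepend a system message to discourage hallucinated function calls.
messages = [
    {
        "role": "system",
        "content": "Only use the functions you have been provided with. "
                   "Do not invent function names or arguments.",
    },
    {"role": "user", "content": "What's the weather like in Boston?"},
]

# messages is then passed to openai.ChatCompletion.create as before.
print(messages[0]["role"])  # system
```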

Conclusion

That's it! With this simple introduction, you are now ready to explore the world of function calling in GPT-4 and GPT-3.5 Turbo. It's a powerful tool that can help you build more advanced and interactive chatbots or data extraction methods. So don't wait - start coding and see where these amazing tools can take you!

Top comments (7)

exec
Hey, awesome post! Love how you broke down function calling and its utility.

The way these AI models can tap into the broader digital world with function calling is mind-blowing. The proficiency of the gpt-4-0613 model at effectively utilizing function calls never ceases to amaze me. For instance, I've developed a bot that can do cool stuff on GitHub - creating repos, tweaking code, all directed by natural language, using function calls. It's wild to see what's possible!

Anyway, just wanted to say thanks for shedding light on this. I'm hoping lots of people are developing around function calls right now, the possibilities seem to be limitless. Can't wait to see where we go next!

Liam Stone

Hey Dylan, thanks for commenting. This functionality opens up so much potential with LLMs. Happy coding!

charliemday

This is great, have you had any luck limiting the number of items in a JSON array using min/max item keys?

Liam Stone

I haven't tried that just yet but will take a look!

Petr Brzek

Hey, if anyone wants to play around with OpenAI function calls in a UI playground, I created one - LangTale. Here's an example of the weather function: langtale.ai/playground/p/duxgbEYjnW

Liam Stone

Thanks for sharing this!

Hide Shidara

This is also a really good guide on how to do this (for the super nerds :))

marcotm.com/articles/information-e...
