wrtnlabs/openai-function-schema

OpenAI Function Call Schema Composer and Executor from OpenAPI (Swagger) Document.
Since it is hard to manage duplicated repositories, I've deprecated this project. Please utilize the libraries below instead.
OpenAI function call schema definition, converter and executor.
`@wrtnio/openai-function-schema` supports OpenAI function call schema definitions and a converter from Swagger (OpenAPI) documents. As for the converter, it supports every version of them:

- Swagger v2.0
- OpenAPI v3.0
- OpenAPI v3.1

Also, `@wrtnio/openai-function-schema` provides a function call executor based on `IOpenAiDocument` and `IOpenAiFunction`, so that you can easily execute a remote RESTful API operation with OpenAI-composed arguments.
Let's learn how to use it with the example code below.
```bash
npm install @wrtnio/openai-function-schema
```
```typescript
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiComposer,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";

import { IBbsArticle } from "../../../api/structures/IBbsArticle";

const main = async (): Promise<void> => {
  // COMPOSE OPENAI FUNCTION CALL SCHEMAS
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  const document: IOpenAiDocument = OpenAiComposer.document({ swagger });

  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  )!;
  const article: IBbsArticle = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: [
      // imagine that arguments are composed by OpenAI
      v4(),
      typia.random<IBbsArticle.ICreate>(),
    ],
  });
  typia.assert(article);
};
main().catch(console.error);
```
For the supported features, please read the description comments of each component.

I'm preparing a documentation and playground website for `@wrtnio/openai-function-schema`. Until then, please read the description comments of the components below. Even though you have to read the source code of each component, their description comments may be enough for you.
- Schema Definitions
  - `IOpenAiDocument`: OpenAI function metadata collection with options
  - `IOpenAiFunction`: OpenAI's function metadata
  - `IOpenAiSchema`: Type schema info with escaped `$ref`
- Functions
  - `OpenAiComposer`: Compose `IOpenAiDocument` from a Swagger (OpenAPI) document
  - `OpenAiFetcher`: Function call executor with `IOpenAiFunction`
  - `OpenAiDataCombiner`: Data combiner for LLM function calls with human-composed data
  - `OpenAiTypeChecker`: Type checker for `IOpenAiSchema`
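As a quick orientation, here is a minimal sketch tying these components together: it composes an `IOpenAiDocument` from a local Swagger document and prints the method and path of every converted function. It only uses calls shown in the examples below; the `swagger.json` file path is an assumption for illustration.

```typescript
import { IOpenAiDocument, OpenAiComposer } from "@wrtnio/openai-function-schema";
import fs from "fs";

const inspect = async (): Promise<void> => {
  // load any Swagger v2.0 / OpenAPI v3.0 / v3.1 document (hypothetical file path)
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );

  // convert it into the OpenAI function call schema collection
  const document: IOpenAiDocument = OpenAiComposer.document({ swagger });

  // every converted function keeps its HTTP method and path
  for (const func of document.functions) console.log(func.method, func.path);
};
inspect().catch(console.error);
```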
```bash
#########
# LAUNCH CLI
#########
# PRIOR TO NODE V20
npm install -g @wrtnio/openai-function-schema
npx wofs

# SINCE NODE V20
npx @wrtnio/openai-function-schema

#########
# PROMPT
#########
--------------------------------------------------------
 Swagger to OpenAI Function Call Schema Converter
--------------------------------------------------------
? Swagger file path: test/swagger.json
? OpenAI Function Call Schema file path: test/plain.json
? Whether to wrap parameters into an object with keyword or not: No
```
Convert a Swagger document into an OpenAI function call schema file with a CLI command.

If you run `npx @wrtnio/openai-function-schema` (or `npx wofs` after the global setup), the CLI (Command Line Interface) will inquire about those arguments. After you fill in all of them, an OpenAI function call schema file of `IOpenAiDocument` type will be created at the target location.

If you want to specify the arguments without prompting, you can pass them like below:
```bash
# PRIOR TO NODE V20
npm install -g @wrtnio/openai-function-schema
npx wofs --input swagger.json --output openai.json --keyword false

# SINCE NODE V20
npx @wrtnio/openai-function-schema --input swagger.json --output openai.json --keyword false
```
Here is the list of `IOpenAiDocument` files generated by the CLI command.
Project | Swagger | Positional | Keyworded |
---|---|---|---|
BBS | swagger.json | positional.json | keyworded.json |
Clickhouse | swagger.json | positional.json | keyworded.json |
Fireblocks | swagger.json | positional.json | keyworded.json |
Iamport | swagger.json | positional.json | keyworded.json |
PetStore | swagger.json | positional.json | keyworded.json |
Shopping Mall | swagger.json | positional.json | keyworded.json |
Toss Payments | swagger.json | positional.json | keyworded.json |
Uber | swagger.json | positional.json | keyworded.json |
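Once such a schema file has been generated, you can load it at runtime instead of recomposing it from the Swagger document on every start. A minimal sketch, assuming the generated JSON (here `openai.json`, as produced by the CLI command above) parses directly into an `IOpenAiDocument`:

```typescript
import { IOpenAiDocument, IOpenAiFunction } from "@wrtnio/openai-function-schema";
import fs from "fs";

const load = async (): Promise<void> => {
  // read the file generated by `npx wofs --input swagger.json --output openai.json ...`
  const document: IOpenAiDocument = JSON.parse(
    await fs.promises.readFile("openai.json", "utf8"),
  );

  // pick a function the same way as when composing at runtime
  const func: IOpenAiFunction | undefined = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  );
  console.log(func !== undefined);
};
load().catch(console.error);
```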
If you want to utilize `@wrtnio/openai-function-schema` at the API level, start by composing an `IOpenAiDocument` through the `OpenAiComposer.document()` method.

After composing the `IOpenAiDocument` data, you can provide its nested `IOpenAiFunction` instances to OpenAI, and OpenAI will compose the arguments through its function calling feature. With those automatically composed arguments, you can execute the function call via the `OpenAiFetcher.execute()` method.

Here is the example code composing and executing an `IOpenAiFunction`.
- Test Function: `test_fetcher_positional_bbs_article_update.ts`
- Backend Server Code: `BbsArticlesController.ts`
```typescript
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiComposer,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";

import { IBbsArticle } from "../../../api/structures/IBbsArticle";

const main = async (): Promise<void> => {
  // COMPOSE OPENAI FUNCTION CALL SCHEMAS
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  const document: IOpenAiDocument = OpenAiComposer.document({ swagger });

  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  )!;
  const article: IBbsArticle = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: [
      // imagine that arguments are composed by OpenAI
      v4(),
      typia.random<IBbsArticle.ICreate>(),
    ],
  });
  typia.assert(article);
};
main().catch(console.error);
```
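In the example above, the `arguments` array is hand-written where the comment says "imagine that arguments are composed by OpenAI". The sketch below shows roughly how that step could be wired to the official `openai` SDK. It is only an illustration: the `name` / `description` / `parameters` shape extracted from `IOpenAiFunction` is an assumption (only `method` and `path` appear in this README), and a positional function may need its parameter list wrapped into a single JSON-schema object before OpenAI will accept it.

```typescript
import OpenAI from "openai";
import { IOpenAiFunction } from "@wrtnio/openai-function-schema";

// Sketch only: ask OpenAI to compose arguments for one converted function.
const composeArguments = async (func: IOpenAiFunction): Promise<unknown> => {
  // ASSUMPTION: the function metadata exposes a name, a description, and a
  // JSON-schema "parameters" object; verify against the real IOpenAiFunction type.
  const meta = func as unknown as {
    name: string;
    description?: string;
    parameters: Record<string, unknown>;
  };

  const client = new OpenAI(); // reads OPENAI_API_KEY from the environment
  const completion = await client.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Update the BBS article." }],
    tools: [
      {
        type: "function",
        function: {
          name: meta.name,
          description: meta.description,
          parameters: meta.parameters,
        },
      },
    ],
  });

  // OpenAI returns the composed arguments as a JSON string inside tool_calls
  const call = completion.choices[0].message.tool_calls?.[0];
  return call && call.type === "function"
    ? JSON.parse(call.function.arguments)
    : undefined;
};
```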
By the way, the target operation function of the above example code has multiple parameters. If you configure a function to have only one parameter by wrapping them into a single object type, the OpenAI function calling feature composes the arguments a little more efficiently than in the multiple-parameters case.

Such a single object-typed parameter is called a keyword parameter, and `@wrtnio/openai-function-schema` supports keyword-parameterized function schemas. When composing an `IOpenAiDocument` through the `OpenAiComposer.document()` method, set `options.keyword` to `true`; then every `IOpenAiFunction` instance will be keyword parameterized. Also, `OpenAiFetcher` understands the keyword-parameterized function specification, so it performs proper execution by automatically decomposing the arguments.

Here is the example code of keyword parameterization.
- Test Function: `test_fetcher_keyword_bbs_article_update.ts`
- Backend Server Code: `BbsArticlesController.ts`
```typescript
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiComposer,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";

import { IBbsArticle } from "../../../api/structures/IBbsArticle";

const main = async (): Promise<void> => {
  // COMPOSE OPENAI FUNCTION CALL SCHEMAS
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  const document: IOpenAiDocument = OpenAiComposer.document({
    swagger,
    options: {
      keyword: true, // keyword parameterizing
    },
  });

  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  )!;
  const article: IBbsArticle = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: [
      // imagine that argument is composed by OpenAI
      {
        id: v4(),
        body: typia.random<IBbsArticle.ICreate>(),
      },
    ],
  });
  typia.assert(article);
};
main().catch(console.error);
```
At last, there can be special API operations where some arguments must be composed by the user, not by the LLM (Large Language Model). For example, if an API operation requires file uploading or a secret key identifier, those values must be composed manually by the user on the frontend application side.

For such cases, `@wrtnio/openai-function-schema` supports the special option `IOpenAiDocument.IOptions.separate`. If you configure this callback function, it is used to determine whether each value must be composed by the user or not. When the arguments are composed by both the user and the LLM, you can combine them into one through the `OpenAiDataCombiner.parameters()` method, so that you can still execute the function call with the `OpenAiFetcher.execute()` method.

Here is the example code of such a special case:
- Test Function: `test_combiner_keyword_parameters_query.ts`
- Backend Server Code: `MembershipController.ts`
```typescript
import {
  IOpenAiDocument,
  IOpenAiFunction,
  IOpenAiSchema,
  OpenAiComposer,
  OpenAiDataCombiner,
  OpenAiFetcher,
  OpenAiTypeChecker,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";

import { IMembership } from "../../api/structures/IMembership";

const main = async (): Promise<void> => {
  // COMPOSE OPENAI FUNCTION CALL SCHEMAS
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  const document: IOpenAiDocument = OpenAiComposer.document({
    swagger,
    options: {
      keyword: true,
      separate: (schema: IOpenAiSchema) =>
        OpenAiTypeChecker.isString(schema) &&
        (schema["x-wrtn-secret-key"] !== undefined ||
          schema["contentMediaType"] !== undefined),
    },
  });

  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "patch" && f.path === "/membership/change",
  )!;
  const membership: IMembership = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: OpenAiDataCombiner.parameters({
      function: func,
      llm: [
        // imagine that below argument is composed by OpenAI
        {
          body: {
            name: "Wrtn Technologies",
            email: "master@wrtn.io",
            password: "1234",
            age: 20,
            gender: 1,
          },
        },
      ],
      human: [
        // imagine that below argument is composed by human
        {
          query: {
            secret: "something",
          },
          body: {
            secretKey: "something",
            picture: "https://wrtn.io/logo.png",
          },
        },
      ],
    }),
  });
  typia.assert(membership);
};
main().catch(console.error);
```