sigoden/llm-functions

Easily create LLM tools and agents using plain Bash/JavaScript/Python functions.


This project empowers you to effortlessly build powerful LLM tools and agents using familiar languages like Bash, JavaScript, and Python.

Forget complex integrations: harness the power of function calling to connect your LLMs directly to custom code and unlock a world of possibilities. Execute system commands, process data, interact with APIs; the only limit is your imagination.

Tools showcase: llm-function-tool

Agents showcase: llm-function-agent

Prerequisites

Make sure you have the following tools installed:

  • argc: A bash command-line framework and command runner
  • jq: A JSON processor

Getting Started with AIChat

Currently, AIChat is the only CLI tool that supports llm-functions. We look forward to more tools supporting llm-functions.

1. Clone the repository

git clone https://github.com/sigoden/llm-functions
cd llm-functions

2. Build tools and agents

I. Create a ./tools.txt file with each tool filename on a new line.

get_current_weather.sh
execute_command.sh
#execute_py_code.py

(Lines starting with # are commented out.)

Where is the web_search tool?

The web_search tool doesn't exist as a concrete file. Instead, you can choose from a variety of web search tools.

To use one as the web_search tool, follow these steps:

  1. Choose a Tool: Available tools include:

    • web_search_cohere.sh
    • web_search_perplexity.sh
    • web_search_tavily.sh
    • web_search_vertexai.sh
  2. Link Your Choice: Use the argc command to link your chosen tool as web_search. For example, to use web_search_perplexity.sh:

    $ argc link-web-search web_search_perplexity.sh

    This command creates a symbolic link, making web_search.sh point to your selected web_search_perplexity.sh tool.

Now there is a web_search.sh ready to be added to your ./tools.txt.
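Under the hood, the link step amounts to a relative symlink. A self-contained sketch of the equivalent shell commands, using a temporary directory rather than the real ./tools/ layout (an assumption for illustration):

```shell
# Sketch of what `argc link-web-search web_search_perplexity.sh` effectively
# does: create web_search.sh as a relative symlink to the chosen tool.
tmp=$(mktemp -d)
touch "$tmp/web_search_perplexity.sh"

# Relative link so the pair can be moved or cloned together.
ln -sf web_search_perplexity.sh "$tmp/web_search.sh"

readlink "$tmp/web_search.sh"   # prints: web_search_perplexity.sh
```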

II. Create a ./agents.txt file with each agent name on a new line.

coder
todo

III. Build bin and functions.json

argc build

IV. Ensure that everything is ready (environment variables, Node/Python dependencies, mcp-bridge server)

argc check

3. Link LLM-functions and AIChat

AIChat expects LLM-functions to be placed in AIChat's functions_dir so that AIChat can use the tools and agents that LLM-functions provides.

You can symlink this repository directory to AIChat's functions_dir with:

ln -s "$(pwd)" "$(aichat --info | sed -n 's/^functions_dir\s\+//p')"
# OR
argc link-to-aichat

Alternatively, you can tell AIChat where the LLM-functions directory is by using an environment variable:

export AICHAT_FUNCTIONS_DIR="$(pwd)"

4. Start using the functions

Done! Now you can use the tools and agents with AIChat.

aichat --role %functions% what is the weather in Paris?
aichat --agent todo list all my todos

Writing Your Own Tools

Building tools for our platform is remarkably straightforward. You can leverage your existing programming knowledge, as tools are essentially just functions written in your preferred language.

LLM Functions automatically generates the JSON declarations for the tools based on comments. Refer to ./tools/demo_tool.{sh,js,py} for examples of how to use comments for autogeneration of declarations.
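For illustration only (the exact generated output may differ), the @describe and @option comments of a tool like execute_command.sh could yield a declaration shaped like the standard function-calling schema:

```json
{
  "name": "execute_command",
  "description": "Execute the shell command.",
  "parameters": {
    "type": "object",
    "properties": {
      "command": {
        "type": "string",
        "description": "The command to execute."
      }
    },
    "required": ["command"]
  }
}
```

Here the ! suffix on --command! marks the parameter as required, which is why it appears in the required array.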

Bash

Create a new bash script in the ./tools/ directory (e.g. execute_command.sh).

#!/usr/bin/env bash
set -e

# @describe Execute the shell command.
# @option --command! The command to execute.

main() {
    eval "$argc_command" >> "$LLM_OUTPUT"
}

eval "$(argc --argc-eval "$0" "$@")"

JavaScript

Create a new JavaScript file in the ./tools/ directory (e.g. execute_js_code.js).

/**
 * Execute the javascript code in node.js.
 * @typedef {Object} Args
 * @property {string} code - Javascript code to execute, such as `console.log("hello world")`
 * @param {Args} args
 */
exports.run = function ({ code }) {
  eval(code);
};

Python

Create a new Python script in the ./tools/ directory (e.g. execute_py_code.py).

def run(code: str):
    """Execute the python code.

    Args:
        code: Python code to execute, such as `print("hello world")`
    """
    exec(code)
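Because a tool is just a plain function, one way to smoke-test it locally before running argc build is to call run directly and capture what the executed code prints. A minimal sketch (not part of the build flow):

```python
# Local smoke test for a tool like execute_py_code.py: invoke run()
# directly and capture the stdout produced by the executed code.
import io
from contextlib import redirect_stdout

def run(code: str):
    """Execute the python code.

    Args:
        code: Python code to execute, such as `print("hello world")`
    """
    exec(code)

buf = io.StringIO()
with redirect_stdout(buf):
    run('print("hello world")')

print(buf.getvalue().strip())  # hello world
```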

Writing Your Own Agents

Agent = Prompt + Tools (Function Calling) + Documents (RAG), which is equivalent to OpenAI's GPTs.

The agent has the following folder structure:

└── agents
    └── myagent
        ├── functions.json     # JSON declarations for functions (auto-generated)
        ├── index.yaml         # Agent definition
        ├── tools.txt          # Shared tools
        └── tools.{sh,js,py}   # Agent tools

The agent definition file (index.yaml) defines crucial aspects of your agent:

name: TestAgent
description: This is a test agent
version: 0.1.0
instructions: You are a test ai agent to ...
conversation_starters:
  - What can you do?
variables:
  - name: foo
    description: This is a foo
documents:
  - local-file.txt
  - local-dir/
  - https://example.com/remote-file.txt

Refer to ./agents/demo for an example of how to implement an agent.

MCP (Model Context Protocol)

  • mcp/server: Let LLM-Functions tools/agents be used through the Model Context Protocol.
  • mcp/bridge: Let external MCP tools be used by LLM-Functions.


License

The project is under the MIT License. Refer to the LICENSE file for detailed information.
