GenAIScript
Automatable GenAI Scripting
Programmatically assemble prompts for LLMs using JavaScript. Orchestrate LLMs, tools, and data in code.
- JavaScript toolbox to work with prompts
- An abstraction layer to make prompt authoring easy and productive
- Seamless Visual Studio Code integration or flexible command line
- Built-in support for GitHub Copilot and GitHub Models, OpenAI, Azure OpenAI, Anthropic, and more
📄 Read the ONLINE DOCUMENTATION at https://microsoft.github.io/genaiscript
📺 Watch Mr. Maeda's Cozy AI Kitchen
📺 Watch an interview on YouTube with nickyt
Say you want to create an LLM script that generates a 'hello world' poem. You can write the following script:
$`Write a 'hello world' poem.`
The $ function is a template tag that creates a prompt. The prompt is then sent to the LLM you configured, which generates the poem.
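Scripts are plain files; assuming you save this one as poem.genai.mjs (a hypothetical name following the .genai.mjs naming convention), you could run it with a command along the lines of the CLI usage shown later in this README:

npx genaiscript run poem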
Let's make it more interesting by adding files, data and structured output. Say you want to include a file in the prompt, and then save the output in a file. You can write the following script:
// read files
const file = await workspace.readText("data.txt")
// include the file content in the prompt in a context-friendly way
def("DATA", file)
// the task
$`Analyze DATA and extract data in JSON in data.json.`
The def function includes the content of the file and optimizes it if necessary for the target LLM. GenAIScript also parses the LLM output and will extract the data.json file automatically.
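To illustrate (the content below is made up, not produced by a real run), the model's answer would contain a file block in the same FILE format shown further below, which GenAIScript then writes to disk:

FILE ./data.json
[ "items extracted from DATA" ]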
Get started quickly by installing the Visual Studio Code Extension or using the command line.
Build prompts programmatically using JavaScript or TypeScript.
def("FILE",env.files,{endsWith:".pdf"})$`Summarize FILE. Today is${newDate()}.`
Edit, Debug, Run, and Test your scripts in Visual Studio Code or with the command line.
Scripts are files! They can be versioned, shared, and forked.
// define the context
def("FILE", env.files, { endsWith: ".pdf" })
// structure the data
const schema = defSchema("DATA", { type: "array", items: { type: "string" } })
// assign the task
$`Analyze FILE and extract data to JSON using the ${schema} schema.`
Define, validate, and repair data using schemas. Zod support built in.
const data = defSchema("MY_DATA", { type: "array", items: { ... } })
$`Extract data from files using ${data} schema.`
def("PDF",env.files,{endsWith:".pdf"})const{ pages}=awaitparsers.PDF(env.files[0])
Manipulate tabular data from CSV, XLSX, ...
def("DATA",env.files,{endsWith:".csv",sliceHead:100})constrows=awaitparsers.CSV(env.files[0])defData("ROWS",rows,{sliceHead:100})
Extract files and diffs from the LLM output. Preview changes in the Refactoring UI.
$`Save the result in poem.txt.`
FILE ./poem.txt
The quick brown fox jumps over the lazy dog.
Grep or fuzz search files.
const { files } = await workspace.grep(/[a-z][a-z0-9]+/, { globs: "*.md" })
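The matched files can then be added back into the prompt context with def, as in this sketch that reuses the result above (the summary task is illustrative):

// add the matching files to the prompt context
def("MATCHES", files)
$`Summarize the contents of MATCHES.`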
Classify text, images or a mix of all.
const joke = await classify(
    "Why did the chicken cross the road? To fry in the sun.",
    { yes: "funny", no: "not funny" }
)
Register JavaScript functions as tools (with fallback for models that don't support tools). Model Context Protocol (MCP) tools are also supported.
defTool(
    "weather",
    "query a weather web api",
    { location: "string" },
    async (args) => await fetch(`https://weather.api.api/?location=${args.location}`)
)
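Once a tool is registered, the model decides when to call it; a minimal sketch of a prompt that would exercise the weather tool above (the question itself is illustrative):

// the model can call the "weather" tool while answering this prompt
$`What is the weather in Seattle?`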
Register JavaScript functions as tools and combine tools + prompt into agents.
defAgent(
    "git",
    "Query a repository using Git to accomplish tasks.",
    `You are a helpful LLM agent that can use the git tools to query the current repository.
    Answer the question in QUERY.
    - The current repository is the same as the github repository.`,
    { model, system: ["system.github_info"], tools: ["git"] }
)
Then use it as a tool:
script({ tools: "agent_git" })
$`Do a statistical analysis of the last commits`
See the git agent source.
Vector search your files for retrieval-augmented generation (RAG).
const { files } = await retrieval.vectorSearch("cats", "**/*.md")
Run models through GitHub Models or GitHub Copilot.
script({ ..., model: "github:gpt-4o" })
Run your scripts with Open Source models, like Phi-3, using Ollama, LocalAI.
script({ ..., model: "ollama:phi3" })
Let the LLM run code in a sandboxed execution environment.
script({ tools: ["python_code_interpreter"] })
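A minimal sketch of a prompt that lets the model delegate work to the sandboxed interpreter (the task itself is illustrative):

// with the python_code_interpreter tool enabled, the model can run code to answer this
$`Use python to compute the factorial of 20 and report the result.`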
Run code in Docker containers.
const c = await host.container({ image: "python:alpine" })
const res = await c.exec("python --version")
Transcribe and screenshot your videos so that you can feed them efficiently into your LLM requests.
// transcribe
const transcript = await transcribe("path/to/audio.mp3")
// screenshots at segments
const frames = await ffmpeg.extractFrames("path_url_to_video", { transcript })
def("TRANSCRIPT", transcript)
def("FRAMES", frames)
Run LLMs to build your LLM prompts.
for (const file of env.files) {
    const { text } = await runPrompt((_) => {
        _.def("FILE", file)
        _.$`Summarize the FILE.`
    })
    def("SUMMARY", text)
}
$`Summarize all the summaries.`
Run your Prompty files as well!
---
name: poem
---
Write me a poem
Scan your chats for secrets using secret scanning.
{"secretPatterns": {...,"OpenAI API Key":"sk-[A-Za-z0-9]{32,48}" }}
Automate your scripts using the command line interface (CLI) or the API.
npx genaiscript run tlaplus-linter "*.tla"
import { run } from "genaiscript/api"
const res = await run("tlaplus-linter", "*.tla")
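If you are not running inside an ES module with top-level await, a small wrapper like this sketch works (the logging is illustrative; see the API docs for the exact result shape):

import { run } from "genaiscript/api"

async function main() {
    // run the linter script against all TLA+ files
    const res = await run("tlaplus-linter", "*.tla")
    // inspect the raw result object
    console.log(res)
}

main()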
GenAIScript provides built-in Responsible AI system prompts and Azure Content Safety support to validate content safety.
script({
    ...,
    system: ["system.safety_harmful_content", ...],
    contentSafety: "azure" // use azure content safety
})
const safety = await host.contentSafety()
const res = await safety.detectPromptInjection(env.vars.input)
Integrate into your Pull Request checks through comments, reviews, or description updates. Supports GitHub Actions and Azure DevOps pipelines.
npx genaiscript ... --pull-request-reviews
Build reliable prompts using tests and evals powered by promptfoo.
script({
    ...,
    tests: {
        files: "penguins.csv",
        rubric: "is a data analysis report",
        facts: "The data refers to the penguin population in Antarctica.",
    }
})
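Tests can also be run from the command line; assuming the script above is saved as penguins.genai.mjs (a hypothetical name), a command along these lines runs its evals:

npx genaiscript test penguins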
The entire content of the docs is rendered at https://microsoft.github.io/genaiscript/llms-full.txt in markdown. Feed it directly into your favorite RAG system.
If you are an LLM crawler, add the .md suffix to any documentation URL to get raw markdown content. For example, https://microsoft.github.io/genaiscript/guides/prompt-as-code.md (note the .md extension).
We accept contributions! Check out the CONTRIBUTING page for details and developer setup.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.