microsoft/promptflow
Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
Welcome! Join us in making prompt flow better by participating in discussions, opening issues, and submitting PRs.
Prompt flow is a suite of development tools designed to streamline the end-to-end development cycle of LLM-based AI applications, from ideation, prototyping, testing, evaluation to production deployment and monitoring. It makes prompt engineering much easier and enables you to build LLM apps with production quality.
With prompt flow, you will be able to:
- Create and iteratively develop flows
  - Create executable flows that link LLMs, prompts, Python code and other tools together.
  - Debug and iterate your flows, especially tracing interaction with LLMs with ease.
- Evaluate flow quality and performance
  - Evaluate your flow's quality and performance with larger datasets.
  - Integrate the testing and evaluation into your CI/CD system to ensure quality of your flow.
- Streamlined development cycle for production
  - Deploy your flow to the serving platform you choose or integrate into your app's code base easily.
  - (Optional but highly recommended) Collaborate with your team by leveraging the cloud version of Prompt flow in Azure AI.
To get started quickly, you can use a pre-built development environment: open the repo in GitHub Codespaces, and then continue with the readme!
If you want to get started in your local environment, first install the packages. Ensure you have a Python environment; python>=3.9, <=3.11 is recommended.
pip install promptflow promptflow-tools
Create a chatbot with prompt flow
Run the following command to initiate a prompt flow from a chat template. It creates a folder named my_chatbot and generates the required files within it:
pf flow init --flow ./my_chatbot --type chat
Set up a connection for your API key
For an OpenAI key, establish a connection by running the following command, using the openai.yaml file in the my_chatbot folder, which stores your OpenAI key (override the key and name with --set to avoid changing the yaml file):
pf connection create --file ./my_chatbot/openai.yaml --set api_key=<your_api_key> --name open_ai_connection
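For reference, the connection spec in openai.yaml is roughly shaped like the sketch below. The field names follow prompt flow's OpenAI connection type; the values are placeholders that --set and --name override at creation time, so treat this as an illustrative sketch rather than the exact generated file.

```yaml
# Illustrative sketch of an OpenAI connection spec (placeholder values).
name: open_ai_connection   # overridden by --name on the CLI
type: open_ai              # prompt flow's OpenAI connection type
api_key: "<user-input>"    # overridden by --set api_key=<your_api_key>
```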
For an Azure OpenAI key, establish the connection by running the following command, using the azure_openai.yaml file:
pf connection create --file ./my_chatbot/azure_openai.yaml --set api_key=<your_api_key> api_base=<your_api_base> --name open_ai_connection
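The azure_openai.yaml spec is similar, but since it targets an Azure OpenAI resource it also carries the endpoint and an API version. Again a hedged sketch with placeholder values (the api_version shown is only an assumed example; use the one matching your resource):

```yaml
# Illustrative sketch of an Azure OpenAI connection spec (placeholder values).
name: open_ai_connection           # same connection name, so the flow needs no changes
type: azure_open_ai                # prompt flow's Azure OpenAI connection type
api_key: "<user-input>"            # overridden by --set api_key=<your_api_key>
api_base: "<your_api_base>"        # overridden by --set api_base=<your_api_base>
api_type: "azure"
api_version: "2023-07-01-preview"  # assumed example; check your Azure OpenAI resource
```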
Chat with your flow
In the my_chatbot folder, there's a flow.dag.yaml file that outlines the flow, including its inputs/outputs, nodes, connection, the LLM model, etc. Note that in the chat node, we're using a connection named open_ai_connection (specified in the connection field) and the gpt-35-turbo model (specified in the deployment_name field). The deployment_name field specifies the OpenAI model, or the Azure OpenAI deployment resource.
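To make that wiring concrete, the relevant part of flow.dag.yaml looks roughly like the following. This is a trimmed, simplified sketch of the chat template (file names such as chat.jinja2 and the exact fields may differ between prompt flow versions):

```yaml
# Simplified sketch of the chat flow definition, not the exact generated file.
inputs:
  question:
    type: string
    is_chat_input: true
outputs:
  answer:
    type: string
    reference: ${chat.output}      # wire the chat node's output to the flow output
    is_chat_output: true
nodes:
- name: chat
  type: llm
  source:
    type: code
    path: chat.jinja2              # prompt template rendered for the LLM call
  inputs:
    deployment_name: gpt-35-turbo  # OpenAI model, or Azure OpenAI deployment name
    question: ${inputs.question}
  connection: open_ai_connection   # the connection created in the previous step
  api: chat
```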
Interact with your chatbot by running (press Ctrl + C to end the session):
pf flow test --flow ./my_chatbot --interactive
Core value: ensuring "High Quality" from prototype to production
Explore our 15-minute tutorial that guides you through prompt tuning ➡ batch testing ➡ evaluation, all designed to ensure your flow is high quality and ready for production.
Next Step! Continue with the Tutorial 👇 section to delve deeper into prompt flow.
Prompt flow is a tool designed to build high-quality LLM apps. The development process in prompt flow follows these steps: develop a flow, improve the flow's quality, and deploy the flow to production.
We also offer a VS Code extension (a flow designer) for an interactive flow development experience with a UI.
You can install it from the Visual Studio Marketplace.
Getting started with prompt flow: A step-by-step guide to invoking your first flow run.
Tutorial: Chat with PDF: An end-to-end tutorial on how to build a high-quality chat application with prompt flow, including flow development and evaluation with metrics.
More examples can be found here. We welcome contributions of new use cases!
If you're interested in contributing, please start with our dev setup guide: dev_setup.md.
Next Step! Continue with the Contributing 👇 section to contribute to prompt flow.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.
The software may collect information about you and your use of the software and send it to Microsoft if configured to enable telemetry. Microsoft may use this information to provide services and improve our products and services. You may turn off the telemetry as described in the repository. There are also some features in the software that may enable you and Microsoft to collect data from users of your applications. If you use these features, you must comply with applicable law, including providing appropriate notices to users of your applications together with a copy of Microsoft's privacy statement. Our privacy statement is located at https://go.microsoft.com/fwlink/?LinkID=824704. You can learn more about data collection and use in the help documentation and our privacy statement. Your use of the software operates as your consent to these practices.
Telemetry collection is on by default. To opt out, please run pf config set telemetry.enabled=false to turn it off.
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the MIT license.