- Notifications
You must be signed in to change notification settings - Fork327
feat: add tutorial notebook for chainguard#78
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to ourterms of service andprivacy statement. We’ll occasionally send you account related emails.
Already on GitHub?Sign in to your account
base:main
Are you sure you want to change the base?
Uh oh!
There was an error while loading.Please reload this page.
Conversation
ChainGuard is an open-source package that provides a simple, reliable way to secure Generative AI applications and agents powered by LangChain from prompt injection, jailbreaks, and other threats withLakera Guard.This tutorial notebook builds on top of the LangChain RAG Quickstart tutorial to illustrate how indirect prompt injection can happen and how ChainGuard can prevent indirect prompt injection in RAG applications.
Check out this pull request on See visual diffs & provide feedback on Jupyter Notebooks. Powered byReviewNB |
c658a72
toad54331
Compare@@ -1,6 +1,7 @@ | |||
.vscode | |||
.idea/ | |||
.venv/ | |||
.env |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others.Learn more.
I added some notes about using a.env
file for the relevant environment variables, so I thought it was pertinent to add.env
to the.gitignore
, too.
@@ -0,0 +1,526 @@ | |||
{ |
merveenoyanApr 19, 2024 • edited
Loading Uh oh!
There was an error while loading.Please reload this page.
edited
Uh oh!
There was an error while loading.Please reload this page.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others.Learn more.
Can you replace this with an open-source model instead? It can also be a hosted endpoint at Hugging Face for instance but with an open-source alternative.
Reply viaReviewNB
What does this PR do?
This PR introduces a new tutorial notebook that builds on top of theLangChain RAG Quickstart tutorial to illustrate howindirect prompt injection can happen in RAG applications, and howChainGuard can prevent indirect prompt injection in RAG applications.
ChainGuard is an open-source Python package that provides a simple, reliable way to secure Generative AI applications and agents powered by LangChain from prompt injection, jailbreaks, and other threats with Lakera Guard.
Who can review?
@MKhalusova