jailbreaks
Here are 5 public repositories matching this topic...
Language:All
Leaked GPTs Prompts Bypass the 25 message limit or to try out GPTs without a Plus subscription.
- Updated
Jan 17, 2024 - Python
Whistleblower is a offensive security tool for testing against system prompt leakage and capability discovery of an AI application exposed through API. Built for AI engineers, security researchers and folks who want to know what's going on inside the LLM-based app they use daily
- Updated
Jul 28, 2024 - Python
A prompt injection game to collect data for robust ML research
- Updated
Jan 27, 2025 - Python
MetaSC: Test-Time Safety Specification Optimization for Language Models
- Updated
Mar 6, 2025 - Python
Improve this page
Add a description, image, and links to thejailbreaks topic page so that developers can more easily learn about it.
Add this topic to your repo
To associate your repository with thejailbreaks topic, visit your repo's landing page and select "manage topics."