Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/benderscript/promptguardian
All-in-one App that Checks LLM prompts for Injection, Data Leaks and Malicious URLs.
https://github.com/benderscript/promptguardian
attacks injection llm openai prompt
Last synced: 17 days ago
JSON representation
All-in-one App that Checks LLM prompts for Injection, Data Leaks and Malicious URLs.
- Host: GitHub
- URL: https://github.com/benderscript/promptguardian
- Owner: BenderScript
- License: apache-2.0
- Created: 2024-01-21T10:09:17.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-03-27T06:54:41.000Z (8 months ago)
- Last Synced: 2024-08-08T05:07:05.605Z (3 months ago)
- Topics: attacks, injection, llm, openai, prompt
- Language: Python
- Homepage:
- Size: 34.6 MB
- Stars: 3
- Watchers: 2
- Forks: 2
- Open Issues: 3
-
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Prompt Guardian
This is application that will check your prompt against a database of
malicious URLs and whether it is considered prompt injection attack.It uses OpenAI to check prompt injection attacks. Why? Because it's fun.
Seriously, it has very good guard rails to prevent you from doing dumb things. Therefore,
if you use another LLM, your prompt will be checked against OpenAI.Finally it checks your prompt for DLP (Data Loss Prevention). It uses an abstracted
API to check your prompt against a database.## OpenAI API Key
You should have an environment variable called `OPENAI_API_KEY` with your OpenAI API key. Alternatively,
you can create a `.env` file on the project root directory with the following content:```bash
OPENAI_API_KEY=
```## Installing Dependencies
```bash
pip3 install -r requirements.txt
```## Running
On the project root directory, run the following command:
```bash
uvicorn prompt_guardian.server:prompt_guardian_app --reload --port 9001
```If Everything goes well, you should see the following page at http://127.0.0.1:9001
![Landing page](images/landing.png)
## Testing
See the demo below where the App checks a prompt with a malicious URL and injection.
![Demo](images/prompt_guardian_demo.gif)