https://github.com/lakeraai/lcguard
Guard your LangChain applications against prompt injection with Lakera ChainGuard.
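As a rough illustration of the pattern the package automates (and not lcguard's actual API), the sketch below runs a prompt-injection check ahead of the model call using LangChain's `RunnableLambda`; `detect_prompt_injection` and `PromptInjectionError` here are hypothetical stand-ins for the library's Lakera Guard-backed detector and error type.

```python
# Minimal sketch of guarding a LangChain chain against prompt injection.
# `detect_prompt_injection` is a hypothetical placeholder: the real library
# delegates this decision to the Lakera Guard API, not a keyword heuristic.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda
from langchain_openai import ChatOpenAI


class PromptInjectionError(ValueError):
    """Raised when the guard flags the input as a likely prompt injection."""


def detect_prompt_injection(text: str) -> bool:
    # Placeholder classifier standing in for a call to Lakera Guard.
    return "ignore all previous instructions" in text.lower()


def guard(inputs: dict) -> dict:
    # Block the request before it ever reaches the LLM.
    if detect_prompt_injection(inputs["question"]):
        raise PromptInjectionError("Blocked: input flagged as prompt injection")
    return inputs


prompt = ChatPromptTemplate.from_template("Answer the question: {question}")
chain = RunnableLambda(guard) | prompt | ChatOpenAI(model="gpt-4o-mini")

if __name__ == "__main__":
    print(chain.invoke({"question": "What is LangChain?"}).content)
```

The guarded step sits first in the chain, so a flagged input raises before any tokens are sent to the model; catching the error lets the application return a safe refusal instead.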
- Host: GitHub
- URL: https://github.com/lakeraai/lcguard
- Owner: lakeraai
- License: MIT
- Created: 2024-01-04T15:13:30.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2025-07-04T15:32:07.000Z (6 months ago)
- Last Synced: 2025-07-04T16:47:17.133Z (6 months ago)
- Topics: langchain, langchain-python, llm, llm-security, prompt-injection
- Language: Python
- Homepage: https://lakeraai.github.io/lcguard
- Size: 1.36 MB
- Stars: 24
- Watchers: 4
- Forks: 5
- Open Issues: 3
- Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Security: SECURITY.md