https://github.com/shannonlal/llm-context-compressor
This is the LLM Compress Prompt Tool for reducing the size of prompts sent to LLMs
- Host: GitHub
- URL: https://github.com/shannonlal/llm-context-compressor
- Owner: shannonlal
- License: apache-2.0
- Created: 2024-04-13T15:58:01.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-04-28T14:28:24.000Z (about 1 year ago)
- Last Synced: 2024-10-29T11:58:24.256Z (7 months ago)
- Language: Python
- Size: 6.74 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# llm-compress-prompt
This is the LLM Compress Prompt Tool for reducing the size of prompts sent to LLMs.

The library is based on Microsoft's LLMLingua (https://github.com/microsoft/LLMLingua) prompt compression technique. LLMLingua is a highly performant prompt compression technique; however, it is designed for organizations that host their own models. LLM Compress Prompt provides similar prompt compression, but it is designed to run without a GPU, instead using third-party LLMs to perform the compression.
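The idea of delegating compression to an external LLM (rather than a locally hosted GPU model, as LLMLingua does) can be sketched as follows. This is a minimal, hypothetical illustration: the function names and the stub client below are assumptions for the sake of the example, not this library's actual API.

```python
# Hypothetical sketch: compress a prompt by asking a third-party LLM to
# shorten it, instead of running a local GPU compression model.
# `llm_call` stands in for any hosted LLM client; all names are illustrative.

def compress_prompt(prompt: str, llm_call, target_ratio: float = 0.5) -> str:
    """Ask an external LLM to shorten `prompt` to roughly target_ratio of its length."""
    instruction = (
        f"Compress the following text to about {int(target_ratio * 100)}% "
        "of its length, keeping all key facts:\n\n" + prompt
    )
    return llm_call(instruction)

# Offline stand-in for a real hosted-LLM client; it just drops filler
# words so the example runs without network access or API keys.
def fake_llm(instruction: str) -> str:
    body = instruction.split("\n\n", 1)[1]
    filler = {"the", "a", "an", "really", "very", "just"}
    return " ".join(w for w in body.split() if w.lower() not in filler)

compressed = compress_prompt(
    "The quick brown fox really just jumped over the lazy dog", fake_llm
)
print(compressed)  # → quick brown fox jumped over lazy dog
```

In practice `llm_call` would wrap a real third-party chat-completion API, and the compressed prompt would then be forwarded to the downstream LLM in place of the original.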