{"id":22883640,"url":"https://github.com/deadbits/prompt-serve","last_synced_at":"2025-05-07T05:50:52.005Z","repository":{"id":177658566,"uuid":"646228964","full_name":"deadbits/prompt-serve","owner":"deadbits","description":"Store and serve language model prompts","archived":false,"fork":false,"pushed_at":"2023-07-26T02:29:40.000Z","size":491,"stargazers_count":27,"open_issues_count":7,"forks_count":1,"subscribers_count":4,"default_branch":"main","last_synced_at":"2025-03-31T07:01:59.642Z","etag":null,"topics":["generative-ai","large-language-models","llm","prompt-engineering","prompt-server"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/deadbits.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-05-27T17:45:38.000Z","updated_at":"2025-03-19T11:45:15.000Z","dependencies_parsed_at":null,"dependency_job_id":"b5525235-ed3e-469a-8a76-c502ad5da77f","html_url":"https://github.com/deadbits/prompt-serve","commit_stats":null,"previous_names":["deadbits/prompt-serve"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/deadbits%2Fprompt-serve","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/deadbits%2Fprompt-serve/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/deadbits%2Fprompt-serve/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/deadbits%2Fprompt-serve/manifests","owner_url":"https://repos.ecosyste.ms/api/v
1/hosts/GitHub/owners/deadbits","download_url":"https://codeload.github.com/deadbits/prompt-serve/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":252823693,"owners_count":21809709,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["generative-ai","large-language-models","llm","prompt-engineering","prompt-server"],"created_at":"2024-12-13T18:39:33.500Z","updated_at":"2025-05-07T05:50:51.997Z","avatar_url":"https://github.com/deadbits.png","language":"Python","readme":"# prompt-serve\n**store and serve language model prompts**\n\n## Overview 📖\n`prompt-serve` helps you manage all of your large language model (LLM) prompts and associated settings/metadata in a straightforward, version controlled manner. \n\nThis project provides a YAML schema for storing prompts in a structured manner and a small API server that handles interactions with a Git repository, so you can treat prompts more like re-usable code. 
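Treating prompts like reusable code mostly means loading a schema-shaped record and filling in its declared `input_variables`. A minimal sketch (the `record` dict stands in for a parsed YAML file, and `render` is a hypothetical helper, not part of prompt-serve):

```python
# A prompt record as it might look after parsing a prompt-serve YAML file
# (field names follow the schema; the values here are illustrative).
record = {
    "title": "summarize-blog-post",
    "prompt": "Summarize the blog post below with 3-5 key takeaways: {blog_content}",
    "input_variables": ["blog_content"],
}

def render(prompt_record, **values):
    """Fill a prompt template's declared input variables with concrete values."""
    missing = set(prompt_record.get("input_variables", [])) - set(values)
    if missing:
        raise KeyError(f"missing input variables: {sorted(missing)}")
    return prompt_record["prompt"].format(**values)
```

Because the variables are declared in the YAML rather than inferred from the template string, a caller can fail fast when a required value is missing instead of producing a half-filled prompt.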
* [Release blog post](https://deadbits.substack.com/p/the-prompt-serve-schema)

## Highlights ✨
* YAML schema for prompts and associated metadata
* Associate prompts with one another to represent chains
* Create "packs" of multiple prompts or chains to represent categories of tasks or workflows
* Store any kind of prompt text or template
* Store LLM provider, model, and settings
* [Command-line utility](tools/contentctl.py) for common tasks:
  * initializing a new Git repository
  * creating prompt files
  * viewing repo statistics
  * converting prompts to [LangChain](https://github.com/hwchase17/langchain) [Prompt Templates](https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/)
* Command-line utility for validating a single prompt or a directory against the schema
* Version controlled via Git
* API server to upload or retrieve prompts (proof of concept / work in progress)

## Schema 🗺️
Prompts follow the schema provided in [schema.yml](schema.yml).

Check out the [prompts](prompts/) repository to see it in action.

```yaml
title: prompt-title-or-name
uuid: prompt-uuid
description: prompt-description
category: prompt-category
provider: model-provider
model: model-name
model_settings:
  temperature: 0.8
  top_k: 40
  top_p: 0.9
prompt: prompt-text
input_variables:
  - var1
  - var2
references:
  - https://example.com
  - https://example.com
associations:
  - prompt_uuid
  - prompt_uuid
packs:
  - pack-uuid
  - pack-uuid
tags:
  - tag
  - tag
```

## Validation ✅
You can use the [validate.py](/tools/validate.py) utility to verify that prompts meet the schema and have unique UUIDs.

When the `--create` argument is specified, a new UUID is generated for any prompt whose UUID isn't unique within the scanned set.
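The two checks just described can be sketched in a few lines. The required-field set below is an assumption for illustration — the authoritative list lives in [schema.yml](schema.yml) — and `check_prompt` is a simplified stand-in for what validate.py does:

```python
import uuid

# Assumed required fields for illustration; schema.yml is authoritative.
REQUIRED_FIELDS = {"title", "uuid", "description", "provider", "model", "prompt"}

def check_prompt(data, seen_uuids):
    """Return a list of problems with a parsed prompt; empty means it passes."""
    problems = ["missing field: " + f for f in sorted(REQUIRED_FIELDS - set(data))]
    pid = data.get("uuid")
    if pid is None or pid in seen_uuids:
        if pid is not None:
            problems.append("duplicate uuid: " + pid)
        # like `validate.py --create`: assign a fresh UUID on collision
        data["uuid"] = str(uuid.uuid4())
    seen_uuids.add(data["uuid"])
    return problems
```

Tracking UUIDs in a set shared across the whole scan is what makes the uniqueness check per-collection rather than per-file.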
You can also gather statistics on the types of prompts in your collection by passing `--gen-stats` (see the next section for example stats output).

```
usage: validate.py [-h] [-s SCHEMA] [-f FILE] [-d DIRECTORY] [-c] [-g]

Validate YAML files against the prompt-serve schema.

options:
  -h, --help            show this help message and exit
  -s SCHEMA, --schema SCHEMA
                        schema file to validate against
  -f FILE, --file FILE  single file to validate
  -d DIRECTORY, --directory DIRECTORY
                        directory to validate
  -c, --create          create new uuids if validation fails
  -g, --gen-stats       generate statistics from directory
```

**Example output**

![Validation output example](/assets/validate.png)

## Statistics utility 📊
The [content control tool](/tools/contentctl.py) can scan a prompt-serve repository directory and display statistics about all of the prompts in the collection, including information on the category, provider, model, and tags.

Stats can also be collected when running [validate.py](/tools/validate.py).

**Example output**

![Stats](/assets/stats.png)

## Use in LangChain ⛓️
prompt-serve files can easily be converted to LangChain Prompt Templates.

The [content control tool](/tools/contentctl.py) can convert individual prompt-serve files to LangChain format.
**Example output**

![langchain conversion](/assets/convert.png)

**Python**

```python
import yaml
from langchain import PromptTemplate

def convert(path_to_ps_prompt):
    """Convert a prompt-serve YAML file to a LangChain PromptTemplate."""
    with open(path_to_ps_prompt, 'r') as fp:
        data = yaml.safe_load(fp)

    prompt = data['prompt']
    input_vars = data.get('input_variables', [])
    return PromptTemplate(template=prompt, input_variables=input_vars)
```

## Prompt creation utility ✍️
The [content control tool](/tools/contentctl.py) can be used to interactively create a prompt that follows the prompt-serve schema.

🪲 This is just a proof of concept and has a few known bugs; you may be better off creating these files by hand for now:
* multi-line input for the "prompt" field is not handled correctly
* no defaults are set for optional fields

```
$ python create.py -n summary.yml
creating prompt file summary.yml ...
title (str): Summarize blog posts
description (str): Summarize a blog post with key takeaways
category (str): summarization
provider (str) : openai
model (str) : gpt-3.5-turbo
temperature (float) : 0.8
top_k (int) : 
top_p (float) : 0.9
max_tokens (int) : 512
stream (bool) : false
presence_penalty (float) : 
frequency_penalty (float) : 
prompt (str): Summarize the blog post provided below with 3-5 key takeaways as bullet points: {blog_content}
references (seq) : https://github.com/deadbits/prompt-serve
associations (seq) : 
packs (seq) : 
tags (seq) : 
successfully wrote file summary.yml
```
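Returning to the statistics utility described above: the per-field tallies it reports (category, provider, model, tags) can be sketched with `collections.Counter` over a list of parsed prompt files. This is a simplified stand-in, not contentctl.py's actual implementation:

```python
from collections import Counter

def collect_stats(prompts):
    """Tally category/provider/model/tags across parsed prompt records."""
    stats = {"category": Counter(), "provider": Counter(),
             "model": Counter(), "tags": Counter()}
    for p in prompts:
        for field in ("category", "provider", "model"):
            if p.get(field):
                stats[field][p[field]] += 1
        # tags is a sequence in the schema, so update() adds one count per tag
        stats["tags"].update(p.get("tags") or [])
    return stats
```

`Counter.most_common()` on each entry then gives the ranked breakdown a stats report would print.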