{"id":18033613,"url":"https://github.com/rbren/go-prompter","last_synced_at":"2025-04-04T22:25:52.789Z","repository":{"id":223185316,"uuid":"759519119","full_name":"rbren/go-prompter","owner":"rbren","description":null,"archived":false,"fork":false,"pushed_at":"2024-03-11T21:33:53.000Z","size":57,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-02-10T07:14:52.506Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/rbren.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-02-18T20:05:03.000Z","updated_at":"2024-05-28T06:51:24.000Z","dependencies_parsed_at":"2024-02-18T21:27:57.875Z","dependency_job_id":"b18487dd-60a6-4ceb-8eaf-5231aa7b804d","html_url":"https://github.com/rbren/go-prompter","commit_stats":null,"previous_names":["rbren/go-prompter"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rbren%2Fgo-prompter","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rbren%2Fgo-prompter/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rbren%2Fgo-prompter/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rbren%2Fgo-prompter/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/rbren","download_url":"https://codeload.github.com/rbren/go-prompter/tar.gz/refs/heads/main","host":{"name":"G
itHub","url":"https://github.com","kind":"github","repositories_count":247257509,"owners_count":20909489,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-10-30T11:07:55.422Z","updated_at":"2025-04-04T22:25:52.776Z","avatar_url":"https://github.com/rbren.png","language":"Go","readme":"# go-prompter\n\ngo-prompter is a set of utilities for prompting LLMs from Go.\n\nThe library is at an early stage, and the API is likely to change.\n\n## Supported Backends\n* OpenAI's GPT\n* Anthropic's Claude\n* Google's Gemini\n* Hugging Face models (experimental)\n\n## Features\n* Prompt templating\n* Extract JSON, Markdown, and code from responses\n* Consistent interface for different models\n* Session management (currently only OpenAI and Claude)\n* Save prompts and responses to local files or S3 for debugging and analysis\n\n## Usage\n\n### Minimal Example\n```go\npackage main\n\nimport (\n    \"fmt\"\n\n    \"github.com/rbren/go-prompter/pkg/chat\"\n)\n\nfunc main() {\n    session := chat.NewSessionFromEnv()\n    resp, _ := session.Prompt(\"Who was the 44th president of the US?\")\n    fmt.Println(resp) // \"Barack Obama was the 44th president.\"\n}\n```\n\n### Select a Model\nYou can use env vars to choose a backend and supply credentials.\n```\nexport OPENAI_API_KEY=\"sk-...\"\nexport OPENAI_MODEL=\"gpt-4-0125-preview\"\n\nexport GEMINI_API_KEY=\"AI...\"\n\nexport CLAUDE_API_KEY=\"sk-ant-api03...\"\nexport CLAUDE_MODEL=\"claude-3-opus-20240229\"\nexport CLAUDE_VERSION=\"2023-06-01\"\n\nexport HUGGING_FACE_API_KEY=\"hf_...\"\nexport 
HUGGING_FACE_URL=\"https://api-inference.huggingface.co/models/codellama/CodeLlama-70b-Instruct-hf\"\n\nexport LLM_BACKEND=\"OPENAI\"\n```\n\nThe backend will be selected automatically from the `LLM_BACKEND` env var when you run:\n```go\nsession := chat.NewSessionFromEnv()\n```\n\nYou can also instantiate a specific model directly:\n```go\npackage main\n\nimport (\n    \"github.com/rbren/go-prompter/pkg/chat\"\n    \"github.com/rbren/go-prompter/pkg/llm\"\n)\n\nfunc main() {\n    // apiKey and modelName are strings you supply\n    client := llm.NewOpenAIClient(apiKey, modelName)\n    session := chat.NewSessionWithLLM(client)\n    _ = session\n}\n```\n\n### With Templates\nYou can use Go's text templating engine to create more powerful and dynamic prompts.\n\nNote that you can include one template file inside another with `{{ template \"example\" . }}`.\nThis is helpful for, e.g., adding a boilerplate preamble to all your prompts.\n\n##### prompts/polite.md\n```markdown\n# Task\nYour task is to respond to the user's query below. Please do so\nas politely as possible. You MUST always refer to the user as \"Sir or Madam\".\n\n## User Query\n{{ .user_query }}\n```\n\n##### main.go\n```go\npackage main\n\nimport (\n    \"embed\"\n    \"fmt\"\n\n    \"github.com/rbren/go-prompter/pkg/chat\"\n)\n\n//go:embed prompts/*.md\nvar templateFS embed.FS\n\nfunc main() {\n    session := chat.NewSession()\n    session.SetFS(\u0026templateFS)\n    resp, err := session.PromptWithTemplate(\"polite\", map[string]any{\n        \"user_query\": \"How tall is Barack Obama?\",\n    })\n    if err != nil {\n        panic(err)\n    }\n    fmt.Println(resp)\n}\n```\n\n### Extract JSON, Markdown, and Code\nYou can extract JSON, Markdown, and code blocks from a model's response.\n\n```go\npackage main\n\nimport (\n    \"fmt\"\n\n    \"github.com/rbren/go-prompter/pkg/chat\"\n)\n\ntype Person struct {\n  Height int `json:\"height\"`\n  Age    int `json:\"age\"`\n}\n\nfunc main() {\n    session := chat.NewSession()\n\n    resp, _ := session.Prompt(\"Please tell me Obama's height in inches and age in years. Respond in JSON format.\")\n    p := Person{}\n    _ = chat.ExtractJSONAndUnmarshal(resp, \u0026p)\n    fmt.Println(p)\n\n    resp, _ = session.Prompt(\"Write a bash script that prints Obama's height and age.\")\n    code := chat.ExtractCode(resp)\n    fmt.Println(code)\n\n    resp, _ = session.Prompt(\"Write an essay in Markdown about Obama\")\n    title := chat.ExtractMarkdownTitle(resp)\n    fmt.Println(title)\n}\n```\n\n### Send Chat History as Context\nYou can optionally send the entire session history to the model as context.\nBe sure to start a new session when you want to clear the context, and don't\nshare sessions across users.\n\n```go\npackage main\n\nimport (\n    \"fmt\"\n\n    \"github.com/rbren/go-prompter/pkg/chat\"\n)\n\nfunc main() {\n    session := chat.NewSession()\n    session.MaxHistory = 4 // keep up to 4 turns of conversation\n    resp, err := session.Prompt(\"Who was the 44th president of the US?\")\n    resp, err = session.Prompt(\"How tall is he?\")\n    if err != nil {\n        panic(err)\n    }\n    fmt.Println(resp)\n}\n```\n\n### Save Debug Prompts and Responses\nYou can save prompts and responses to local files or S3 for debugging and analysis.\n\n```go\npackage main\n\nimport (\n    \"fmt\"\n\n    \"github.com/rbren/go-prompter/pkg/chat\"\n    \"github.com/rbren/go-prompter/pkg/files\"\n)\n\nfunc main() {\n    session := chat.NewSession()\n    session.SetDebugFileManager(files.LocalFileManager{\n      BasePath: \"./debug/\",\n    })\n    session.SessionID = \"presidents\" // defaults to a random UUID\n    resp, err := session.PromptWithID(\"44\", \"Who was the 44th president of the US?\")\n    if err != nil {\n        panic(err)\n    }\n    fmt.Println(resp)\n    // Writes:\n    //   ./debug/presidents/44/prompt.md\n    //   ./debug/presidents/44/response.md\n}\n```\n\n## Example Projects\n* https://github.com/rbren/vizzy\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frbren%2Fgo-prompter","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Frbren%2Fgo-prompter","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frbren%2Fgo-prompter/lists"}