# msglm

<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->

### Installation

Install the latest version from PyPI:

``` sh
$ pip install msglm
```

## Usage

To use an LLM, we need to structure our messages in a particular format.

Here’s an example of a text chat from the OpenAI docs.

``` python
from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
  model="gpt-4o",
  messages=[
    {"role": "user", "content": "What's the Wild Atlantic Way?"}
  ]
)
```

Generating the correct format for a particular API can get tedious. The goal of *msglm* is to make it easier.

The examples below show you how to use *msglm* for text and image chats with OpenAI and Anthropic.

### Text Chats

For a text chat, simply pass a list of strings and the API format (e.g.
“openai”) to **mk_msgs** and it will generate the correct format.

``` python
mk_msgs(["Hello, world!", "some assistant response"], api="openai")
```

``` js
[
    {"role": "user", "content": "Hello, world!"},
    {"role": "assistant", "content": "some assistant response"}
]
```

#### anthropic

``` python
from msglm import mk_msgs_anthropic as mk_msgs
from anthropic import Anthropic
client = Anthropic()

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.content[0].text)
```

#### openai

``` python
from msglm import mk_msgs_openai as mk_msgs
from openai import OpenAI

client = OpenAI()
r = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.choices[0].message.content)
```

### Image Chats

For an image chat, simply pass the raw image bytes in a list with your question to *mk_msg* and it will generate the correct format.

``` python
mk_msg([img, "What's in this image?"], api="anthropic")
```

``` js
{
    "role": "user",
    "content": [
        {"type": "image", "source": {"type": "base64", "media_type": media_type, "data": img}},
        {"type": "text", "text": "What's in this image?"}
    ]
}
```

#### anthropic

``` python
import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[mk_msg([img, "Describe the image"])]
)
print(r.content[0].text)
```

#### openai

``` python
import httpx
from msglm import mk_msg_openai as mk_msg
from openai import OpenAI

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

client = OpenAI()
r = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[mk_msg([img, "Describe the image"])]
)
print(r.choices[0].message.content)
```

### API Wrappers

To make life a little easier, *msglm* comes with API-specific wrappers for
[`mk_msg`](https://AnswerDotAI.github.io/msglm/core.html#mk_msg) and
[`mk_msgs`](https://AnswerDotAI.github.io/msglm/core.html#mk_msgs).

For Anthropic use

``` python
from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs
```

For OpenAI use

``` python
from msglm import mk_msg_openai as mk_msg, mk_msgs_openai as mk_msgs
```

### Other use-cases

#### Prompt Caching

*msglm* supports [prompt caching](https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching) for Anthropic models. Simply pass *cache=True* to *mk_msg* or *mk_msgs*.

``` python
from msglm import mk_msg_anthropic as mk_msg

mk_msg("please cache my message", cache=True)
```

This generates the cache block below:

``` js
{
    "role": "user",
    "content": [
        {"type": "text", "text": "please cache my message", "cache_control": {"type": "ephemeral"}}
    ]
}
```

#### PDF chats

*msglm* offers PDF [support](https://docs.anthropic.com/en/docs/build-with-claude/pdf-support) for Anthropic.
Just like an image chat, all you need to do is pass the raw PDF bytes in a list with your question to *mk_msg* and it will generate the correct format, as shown in the example below.

``` python
import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})

url = "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf"
pdf = httpx.get(url).content

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([pdf, "Which model has the highest human preference win rates across each use-case?"])]
)
print(r.content[0].text)
```

Note: this feature is currently in beta, so you’ll need to:

- use the Anthropic beta client
  (e.g. `anthropic.Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})`)
- use the `claude-3-5-sonnet-20241022` model

#### Citations

*msglm* supports Anthropic [citations](https://docs.anthropic.com/en/docs/build-with-claude/citations).
All you need to do is pass the content of your document to *mk_ant_doc* and then pass the output to *mk_msg* along with your question, as shown in the example below.

``` python
from msglm import mk_ant_doc, mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

doc = mk_ant_doc("The grass is green. The sky is blue.", title="My Document")

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([doc, "What color is the grass and sky?"])]
)
for o in r.content:
    if c := getattr(o, 'citations', None): print(f"{o.text}. source: {c[0]['cited_text']} from {c[0]['document_title']}")
    else: print(o.text)
```

*Note: The citations feature is currently available on Claude 3.5 Sonnet
(new) and 3.5 Haiku.*

### Summary

We hope *msglm* will make your life a little easier when chatting with LLMs. To learn more about the package, please read the [docs](https://answerdotai.github.io/msglm/).
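If you’re curious how the alternating `user`/`assistant` roles in the text-chat example come about, the behaviour can be sketched in a few lines of plain Python. This is an illustrative sketch only, not msglm’s actual implementation — the real `mk_msgs` also handles images, PDFs, and API-specific formatting:

``` python
def mk_msgs_sketch(msgs):
    # Assign alternating roles to a list of plain strings, starting with
    # "user", mirroring the output shown in the text-chat example.
    roles = ("user", "assistant")
    return [{"role": roles[i % 2], "content": m} for i, m in enumerate(msgs)]

print(mk_msgs_sketch(["Hello, world!", "some assistant response"]))
```

Each string is wrapped in a dict carrying the next role in turn, which is why you can pass plain strings to `mk_msgs` and get back a valid conversation history.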