{"id":13584976,"url":"https://github.com/snowby666/poe-api-wrapper","last_synced_at":"2025-03-13T09:07:27.507Z","repository":{"id":187998312,"uuid":"677933462","full_name":"snowby666/poe-api-wrapper","owner":"snowby666","description":"👾 A Python API wrapper for Poe.com. With this, you will have free access to GPT-4, Claude, Llama, Gemini, Mistral and more! 🚀","archived":false,"fork":false,"pushed_at":"2025-02-22T11:07:34.000Z","size":2904,"stargazers_count":1052,"open_issues_count":31,"forks_count":132,"subscribers_count":25,"default_branch":"main","last_synced_at":"2025-03-06T08:07:29.817Z","etag":null,"topics":["api","chatbot","chatgpt","claude","code-llama","dall-e","gemini","gpt-4","groq","llama","mistral","openai","palm2","poe","poe-api","python","quora","qwen","reverse-engineering","stable-diffusion"],"latest_commit_sha":null,"homepage":"https://pypi.org/project/poe-api-wrapper/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/snowby666.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-08-13T06:01:10.000Z","updated_at":"2025-03-05T16:18:46.000Z","dependencies_parsed_at":"2023-12-28T05:28:31.939Z","dependency_job_id":"72488abd-792d-4843-9e0a-3845970427c4","html_url":"https://github.com/snowby666/poe-api-wrapper","commit_stats":{"total_commits":188,"total_committers":11,"mean_commits":17.09090909090909,"dds":"0.12234042553191493","last_synced_commit":"420b75a7f7696ccf5c2e270e7b4584c249905284"},"previous_names":["snowby666/poe-api-wrapper"],"tags_count":32,"template":false,"template_full_name":null,"re
pository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/snowby666%2Fpoe-api-wrapper","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/snowby666%2Fpoe-api-wrapper/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/snowby666%2Fpoe-api-wrapper/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/snowby666%2Fpoe-api-wrapper/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/snowby666","download_url":"https://codeload.github.com/snowby666/poe-api-wrapper/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":243371570,"owners_count":20280538,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["api","chatbot","chatgpt","claude","code-llama","dall-e","gemini","gpt-4","groq","llama","mistral","openai","palm2","poe","poe-api","python","quora","qwen","reverse-engineering","stable-diffusion"],"created_at":"2024-08-01T15:04:38.332Z","updated_at":"2025-03-13T09:07:27.496Z","avatar_url":"https://github.com/snowby666.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n\u003ca href=\"https://github.com/snowby666\"\u003e\n\u003cimg src=\"https://socialify.git.ci/snowby666/poe-api-wrapper/image?font=Raleway\u0026forks=1\u0026issues=1\u0026language=1\u0026logo=https://i.ibb.co/xHrZxFY/logo-nobg.png\u0026name=1\u0026owner=1\u0026pattern=Charlie%20Brown\u0026pulls=1\u0026stargazers=1\u0026theme=Auto\" width=\"700\" height=\"350\"\u003e\u003c/a\u003e\n\n\u003ch1\u003ePoe API Wrapper \u003cimg 
src=\"https://psc2.cf2.poecdn.net/favicon.svg\" height=\"35\"\u003e\u003c/h1\u003e\n\n\u003cp\u003e\u003cem\u003eA simple, lightweight and efficient API wrapper for Poe.com\u003c/em\u003e\u003c/p\u003e\n\u003c/div\u003e\n\n\u003cp align=\"center\"\u003e\n\u003ca href=\"https://pypi.org/project/poe-api-wrapper/\"\u003e\u003cimg src=\"https://img.shields.io/pypi/v/poe-api-wrapper\"\u003e\u003c/a\u003e\n\u003cimg alt=\"Python Version\" src=\"https://img.shields.io/badge/python-3.7+-blue.svg\" alt=\"python\"\u003e\n\u003ca href=\"https://www.pepy.tech/projects/poe-api-wrapper\"\u003e\n\u003cimg alt=\"PyPI - Downloads\" src=\"https://static.pepy.tech/badge/poe-api-wrapper\"\u003e\u003c/a\u003e\n\u003ca href=\"https://discord.gg/apUUqbxCBQ\"\u003e\n\u003cimg alt=\"Support Server\" src=\"https://dcbadge.limes.pink/api/server/https://discord.com/invite/apUUqbxCBQ?style=flat\"\u003e\u003c/a\u003e\n\u003cbr\u003e\n\u003c/p\u003e\n\n## 📚 Table of Contents\n- [📚 Table of Contents](#-table-of-contents)\n- [✨ Highlights](#-highlights)\n- [🔧 Installation](#-installation)\n- [🦄 Documentation](#-documentation)\n  - [Available Default Bots](#available-default-bots)\n  - [How to get your Token](#how-to-get-your-token)\n    - [Getting p-b and p-lat cookies (*required*)](#getting-p-b-and-p-lat-cookies-required)\n    - [Getting formkey (*optional*)](#getting-formkey-optional)\n  - [OpenAI](#openai)\n    - [Available Routes](#available-routes)\n    - [Quick Setup](#quick-setup)\n    - [Built-in completion (WIP)](#built-in-completion-wip)\n    - [OpenAI Proxy Server](#openai-proxy-server)\n      - [Chat](#chat)\n      - [Images](#images)\n      - [Models](#models)\n  - [Basic Usage](#basic-usage)\n  - [Bots Group Chat](#bots-group-chat)\n  - [Misc](#misc)\n    - [Text files](#text-files)\n    - [Media files](#media-files)\n- [🙌 Contributing](#-contributing)\n  - [Run debug](#run-debug)\n  - [Ways to contribute](#ways-to-contribute)\n  - [Contributors](#contributors)\n- [🤝 
Copyright](#-copyright)\n  - [Copyright Notice](#copyright-notice)\n\n## ✨ Highlights\n\u003cdetails close\u003e\n\u003csummary\u003eSupport both \u003cb\u003eSync\u003c/b\u003e and \u003cb\u003eAsync\u003c/b\u003e\u003c/summary\u003e\n\u003c/details\u003e\n\u003cdetails close\u003e\n\u003csummary\u003eAuthentication\u003c/summary\u003e\u003cbr\u003e\n\u003cul\u003e\n\u003cli\u003eLog in with your Poe tokens\u003c/li\u003e\n\u003cli\u003eAuto Proxy requests\u003c/li\u003e\n\u003cli\u003eSpecify Proxy context\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/details\u003e\n\u003cdetails close\u003e\n\u003csummary\u003eMessage Automation\u003c/summary\u003e\u003cbr\u003e\n\u003cul\u003e\n\u003cli\u003eCreate new chat thread\u003c/li\u003e\n\u003cli\u003eSend messages\u003c/li\u003e\n\u003cli\u003eStream bot responses\u003c/li\u003e\n\u003cli\u003eSend concurrent messages\u003c/li\u003e\n\u003cli\u003eRetry the last message\u003c/li\u003e\n\u003cli\u003eSupport file attachments\u003c/li\u003e\n\u003cli\u003eRetrieve suggested replies\u003c/li\u003e\n\u003cli\u003eStop message generation\u003c/li\u003e\n\u003cli\u003eDelete chat threads\u003c/li\u003e\n\u003cli\u003eClear conversation context\u003c/li\u003e\n\u003cli\u003ePurge messages of 1 bot\u003c/li\u003e\n\u003cli\u003ePurge all messages of user\u003c/li\u003e\n\u003cli\u003eFetch previous messages\u003c/li\u003e\n\u003cli\u003eShare and import messages\u003c/li\u003e\n\u003cli\u003eGet citations\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/details\u003e\n\u003cdetails close\u003e\n\u003csummary\u003eChat Management\u003c/summary\u003e\u003cbr\u003e\n\u003cul\u003e\n\u003cli\u003eGet Chat Ids \u0026 Chat Codes of bot(s)\u003c/li\u003e\n\u003cli\u003eGet subscription info and remaining points\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/details\u003e\n\u003cdetails close\u003e\n\u003csummary\u003eBot Management\u003c/summary\u003e\u003cbr\u003e\n\u003cul\u003e\n\u003cli\u003eGet bot info\u003c/li\u003e\n\u003cli\u003eGet available 
creation models\u003c/li\u003e\n\u003cli\u003eCreate custom bot\u003c/li\u003e\n\u003cli\u003eEdit custom bot\u003c/li\u003e\n\u003cli\u003eDelete a custom bot\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/details\u003e\n\u003cdetails close\u003e\n\u003csummary\u003eKnowledge Base Customization\u003c/summary\u003e\u003cbr\u003e\n\u003cul\u003e\n\u003cli\u003eGet available knowledge bases\u003c/li\u003e\n\u003cli\u003eUpload knowledge bases for custom bots\u003c/li\u003e\n\u003cli\u003eEdit knowledge bases for custom bots\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/details\u003e\n\u003cdetails close\u003e\n\u003csummary\u003eDiscovery\u003c/summary\u003e\u003cbr\u003e\n\u003cul\u003e\n\u003cli\u003eGet available bots\u003c/li\u003e\n\u003cli\u003eGet a user's bots\u003c/li\u003e\n\u003cli\u003eGet available categories\u003c/li\u003e\n\u003cli\u003eExplore 3rd party bots and users\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/details\u003e\n\u003cdetails close\u003e\n\u003csummary\u003eBots Group Chat \u003cb\u003e(Beta)\u003c/b\u003e\u003c/summary\u003e\u003cbr\u003e\n\u003cul\u003e\n\u003cli\u003eCreate a group chat\u003c/li\u003e\n\u003cli\u003eDelete a group chat\u003c/li\u003e\n\u003cli\u003eGet created groups\u003c/li\u003e\n\u003cli\u003eGet group data\u003c/li\u003e\n\u003cli\u003eSave group chat history\u003c/li\u003e\n\u003cli\u003eLoad group chat history\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/details\u003e\n\n## 🔧 Installation\n- First, install this library with the following command:\n```ShellSession\npip install -U poe-api-wrapper\n```\nOr you can install auto-proxy version of this library for **Python 3.9+**\n```ShellSession\npip install -U 'poe-api-wrapper[proxy]'\n```\nQuick setup for Async Client:\n```py\nfrom poe_api_wrapper import AsyncPoeApi\nimport asyncio\ntokens = {\n    'p-b': ..., \n    'p-lat': ...,\n}\n\nasync def main():\n    client = await AsyncPoeApi(tokens=tokens).create()\n    message = \"Explain quantum computing in simple terms\"\n    async for 
chunk in client.send_message(bot=\"gpt3_5\", message=message):\n        print(chunk[\"response\"], end='', flush=True)\n        \nasyncio.run(main())\n```\n- You can run an example of this library:\n```py\nfrom poe_api_wrapper import PoeExample\ntokens = {\n    'p-b': ..., \n    'p-lat': ...,\n}\nPoeExample(tokens=tokens).chat_with_bot()\n```\n- This library also supports command-line interface:\n```ShellSession\npoe -b P-B_HERE -lat P-LAT_HERE -f FORMKEY_HERE\n```\n\u003e [!TIP]\n\u003e Type `poe -h` for more info\n\n\u003cimg src=\"https://i.imgur.com/oAkTHfB.png\" width=\"100%\" height=\"auto\"\u003e\n\n## 🦄 Documentation\n### Available Default Bots\n| Display Name            | Model                     | Token Limit | Words | Access Type                                                     |\n| ----------------------- | ------------------------- | ----------- | ----- | --------------------------------------------------------------- |\n| Assistant               | capybara                  | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Claude-3.5-Sonnet       | claude_3_igloo            | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Claude-3-Opus           | claude_2_1_cedar          | 4K          | 3K    | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| Claude-3-Sonnet         | claude_2_1_bamboo         | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Claude-3-Haiku          | claude_3_haiku            | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Claude-3.5-Sonnet-200k  | claude_3_igloo_200k       | 200K        | 150K  | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Claude-3-Opus-200k      | claude_3_opus_200k        | 200K        | 150K  | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| Claude-3-Sonnet-200k    | 
claude_3_sonnet_200k      | 200K        | 150K  | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| Claude-3-Haiku-200k     | claude_3_haiku_200k       | 200K        | 150K  | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Claude-2                | claude_2_short            | 4K          | 3K    | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| Claude-2-100k           | a2_2                      | 100K        | 75K   | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| Claude-instant          | a2                        | 9K          | 7K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Claude-instant-100k     | a2_100k                   | 100K        | 75K   | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| GPT-3.5-Turbo           | chinchilla                | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| GPT-3.5-Turbo-Raw       | gpt3_5                    | 2K          | 1.5K  | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| GPT-3.5-Turbo-Instruct  | chinchilla_instruct       | 2K          | 1.5K  | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| ChatGPT-16k             | agouti                    | 16K         | 12K   | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| GPT-4-Classic           | gpt4_classic              | 2K          | 1.5K  | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| GPT-4-Turbo             | beaver                    | 4K          | 3K    | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| GPT-4-Turbo-128k        | vizcacha                  | 128K        | 96K   | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| GPT-4o                  | gpt4_o                    | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               
|\n| GPT-4o-128k             | gpt4_o_128k               | 128K        | 96K   | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n| GPT-4o-Mini             | gpt4_o_mini               | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| GPT-4o-Mini-128k        | gpt4_o_mini_128k          | 128K        | 96K    | ![Free](https://img.shields.io/badge/free-2feb7a)              |\n| Google-PaLM             | acouchy                   | 8K          | 6K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Code-Llama-13b          | code_llama_13b_instruct   | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Code-Llama-34b          | code_llama_34b_instruct   | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Solar-Mini              | upstage_solar_0_70b_16bit | 2K          | 1.5K  | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Gemini-1.5-Flash-Search | gemini_pro_search         | 4K          | 3K    | ![Free](https://img.shields.io/badge/free-2feb7a)               |\n| Gemini-1.5-Pro-2M       | gemini_1_5_pro_1m         | 2M          | 1.5M  | ![Subscriber](https://img.shields.io/badge/subscriber-fc4747)   |\n\u003e [!IMPORTANT]  \n\u003e The data on token limits and word counts listed above are approximate and may not be entirely accurate, as the pre-prompt engineering process of poe.com is private and not publicly disclosed. \n\u003e\n\u003e The table above only shows bots with different display names from their models. 
Other bots on poe.com have the same display name as their model.\n\n### How to get your Token\n\n#### Getting p-b and p-lat cookies (*required*)\nSign in at https://poe.com/\n\nF12 for Devtools (Right-click + Inspect)\n- Chromium: Devtools \u003e Application \u003e Cookies \u003e poe.com\n- Firefox: Devtools \u003e Storage \u003e Cookies\n- Safari: Devtools \u003e Storage \u003e Cookies\n\nCopy the values of the `p-b` and `p-lat` cookies\n\n#### Getting formkey (*optional*)\n\u003e [!IMPORTANT] \n\u003e By default, **poe-api-wrapper** will automatically retrieve the formkey for you. If that doesn't work, please pass this token manually by following these steps:\n\nThere are two ways to get the formkey:\n\nF12 for Devtools (Right-click + Inspect)\n\n- 1st Method: Devtools \u003e Network \u003e gql_POST \u003e Headers \u003e Poe-Formkey\n\n    Copy the value of `Poe-Formkey`\n\n- 2nd Method: Devtools \u003e Console \u003e Type: `allow pasting` \u003e Paste this script: `window.ereNdsRqhp2Rd3LEW()`\n\n    Copy the result\n\n### OpenAI\n\u003cdetails close\u003e\n\u003csummary\u003eRead Docs\u003c/summary\u003e\n\n#### Available Routes\n\n- /models\n- /chat/completions\n- /images/generations\n- /images/edits\n- /v1/models\n- /v1/chat/completions\n- /v1/images/generations\n- /v1/images/edits\n\n#### Quick Setup\n- First, install the additional packages:\n```ShellSession\npip install -U 'poe-api-wrapper[llm]'\n```\n- Clone the repo or use the same setup as in the `openai` folder:\n```ShellSession\ngit clone https://github.com/snowby666/poe-api-wrapper.git\ncd poe-api-wrapper/poe_api_wrapper/openai\n```\n- Modify `secrets.json` with your own tokens\n  \n- Run the FastAPI server:\n```ShellSession\npython api.py\n```\n- Run the examples:\n```ShellSession\npython example.py\n```\n\n#### Built-in completion (WIP)\n\n#### OpenAI Proxy Server\n- Start the server\n```py\nfrom poe_api_wrapper import PoeServer\ntokens = [\n    {\"p-b\": \"XXXXXXXX\", \"p-lat\": \"XXXXXXXX\"},\n    {\"p-b\": \"XXXXXXXX\", 
\"p-lat\": \"XXXXXXXX\"},\n    {\"p-b\": \"XXXXXXXX\", \"p-lat\": \"XXXXXXXX\"}\n]\nPoeServer(tokens=tokens)\n\n# You can also specify address and port (default is 127.0.0.1:8000)\nPoeServer(tokens=tokens, address=\"0.0.0.0\", port=\"8080\")\n```\n\n##### Chat\n- Non-streamed example:\n```py\nimport openai \nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer anything\"})\n\nresponse = client.chat.completions.create(\n    model=\"gpt-3.5-turbo\", \n    messages = [\n                {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n                {\"role\": \"user\", \"content\": \"Hello!\"}\n            ]\n)\n\nprint(response.choices[0].message.content)\n```\n- Streaming example:\n```py\nimport openai \nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer anything\"})\n\nstream = client.chat.completions.create(\n    model=\"gpt-3.5-turbo\", \n    messages = [\n                {\"role\": \"user\", \"content\": \"this is a test request, write a short poem\"}\n            ],\n    stream=True\n)\n\nfor chunk in stream:\n    print(chunk.choices[0].delta.content or \"\", end=\"\", flush=True)\n\n# Set max_tokens\nstream_2 = client.chat.completions.create(\n    model=\"claude-instant\", \n    messages = [\n                {\"role\": \"user\", \"content\": \"Can you tell me about the creation of blackholes?\"}\n            ],\n    stream=True,\n    max_tokens=20, # if max_tokens reached, finish_reason will be 'length'\n)\n\nfor chunk in stream_2:\n    print(chunk.choices[0].delta.content or \"\", end=\"\", flush=True)\n\n# Include usage \nstream_3 = client.chat.completions.create(\n    model=\"claude-instant\", \n    messages = [\n                {\"role\": \"user\", \"content\": \"Write a 100-character meta description for my blog post about llamas\"}\n            ],\n    stream=True,\n    
max_tokens=4096,\n    stream_options={\n\t\t\"include_usage\": True # last chunk contains prompt_tokens, completion_tokens and total_tokens\n\t}\n)\n\nfor chunk in stream_3:\n    print(chunk, end=\"\\n\\n\", flush=True)\n```\n- Image input example:\n```py\nimport openai \nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer anything\"})\n\n# Legacy style (https://platform.openai.com/docs/api-reference/chat/create)\nresponse = client.chat.completions.create(\n    model=\"claude-3.5-sonnet\",\n    messages=[\n        {\n            \"role\": \"user\",\n            \"content\": [\n                {\"type\": \"text\", \"text\": \"What's in this image?\"},\n                {\n                    \"type\": \"image_url\",\n                    \"image_url\": \"https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\",\n                }\n            ],\n        }\n    ]\n)\n\n# New style (https://platform.openai.com/docs/guides/vision)\nresponse = client.chat.completions.create(\n    model=\"claude-3.5-sonnet\",\n    messages=[\n        {\n            \"role\": \"user\",\n            \"content\": [\n                {\"type\": \"text\", \"text\": \"What's in this image?\"},\n                {\n                    \"type\": \"image_url\",\n                    \"image_url\": {\n                        \"url\": \"https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\"\n                    }\n                }\n            ],\n        }\n    ]\n)\n\n# Multiple images\nresponse = client.chat.completions.create(\n    model=\"gpt-4o\",\n    messages=[\n    {\n      \"role\": \"user\",\n      \"content\": [\n        {\n          \"type\": \"text\",\n          \"text\": \"What are in these images? 
Is there any difference between them?\",\n        },\n        {\n          \"type\": \"image_url\",\n          \"image_url\": {\n            \"url\": \"https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\",\n          },\n        },\n        {\n          \"type\": \"image_url\",\n          \"image_url\": {\n            \"url\": \"https://imgcdn.stablediffusionweb.com/2024/4/29/0b0b8798-1965-4e3d-b0a8-d153728320d4.jpg\",\n          }\n        }\n      ]\n    }\n  ]\n)\n\n# Base64 image\nimport base64\n\n# Function to encode the image\ndef encode_image(image_path):\n  with open(image_path, \"rb\") as image_file:\n    return base64.b64encode(image_file.read()).decode('utf-8')\n\n# Path to your image\nimage_path = \"path_to_your_image.jpg\"\n\n# Getting the base64 string\nbase64_image = encode_image(image_path)\n\nresponse = client.chat.completions.create(\n    model=\"gpt-4o\",\n    messages=[\n    {\n      \"role\": \"user\",\n      \"content\": [\n        {\n          \"type\": \"text\",\n          \"text\": \"What’s in this image?\"\n        },\n        {\n          \"type\": \"image_url\",\n          \"image_url\": {\n            \"url\": f\"data:image/jpeg;base64,{base64_image}\"\n          }\n        }\n      ]\n    }\n  ]\n)\n\nprint(response.choices[0].message.content)\n```\n- Function calling example:\n```py\nimport openai, json\nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer anything\"})\n\nTEST_MODEL = \"gpt-4o-mini\"\n\n# Example dummy function hard coded to return the same weather\n# In production, this could be your backend API or an external API\ndef get_current_temperature(location, unit=\"fahrenheit\"):\n    \"\"\"Get the current weather in a given location\"\"\"\n    if \"tokyo\" in location.lower():\n        return json.dumps({\"location\": \"Tokyo\", 
\"temperature\": \"10\", \"unit\": unit})\n    elif \"san francisco\" in location.lower():\n        return json.dumps({\"location\": \"San Francisco\", \"temperature\": \"72\", \"unit\": unit})\n    elif \"paris\" in location.lower():\n        return json.dumps({\"location\": \"Paris\", \"temperature\": \"22\", \"unit\": unit})\n    else:\n        return json.dumps({\"location\": location, \"temperature\": \"unknown\"})\n    \ndef get_rain_probability(location):\n    \"\"\"Get the probability of rain in a given location\"\"\"\n    if \"tokyo\" in location.lower():\n        return json.dumps({\"location\": \"Tokyo\", \"rain_probability\": \"10%\"})\n    elif \"san francisco\" in location.lower():\n        return json.dumps({\"location\": \"San Francisco\", \"rain_probability\": \"20%\"})\n    elif \"paris\" in location.lower():\n        return json.dumps({\"location\": \"Paris\", \"rain_probability\": \"30%\"})\n    else:\n        return json.dumps({\"location\": location, \"rain_probability\": \"unknown\"})\n    \ndef run_conversation():\n    # Step 1: send the conversation and available functions to the model\n    messages = [\n        {'role': 'user', 'content': \"Hello there. What the weather like in Tokyo?\"},\n        {'role': 'assistant', 'content': \"Let me check the weather for you.\"},\n        {'role': 'user', 'content': \"What is the chance of raining in paris? 
Can you also tell me the temperature in Tokyo and LA?\"},\n                ]\n    tools = [\n    {\n      \"type\": \"function\",\n      \"function\": {\n        \"name\": \"get_current_temperature\",\n        \"description\": \"Get the current temperature for a specific location\",\n        \"parameters\": {\n          \"type\": \"object\",\n          \"properties\": {\n            \"location\": {\n              \"type\": \"string\",\n              \"description\": \"The city and state, e.g., San Francisco, CA\"\n            },\n            \"unit\": {\n              \"type\": \"string\",\n              \"enum\": [\"Celsius\", \"Fahrenheit\"],\n              \"description\": \"The temperature unit to use. Infer this from the user's location.\"\n            }\n          },\n          \"required\": [\"location\", \"unit\"]\n        }\n      }\n    },\n    {\n      \"type\": \"function\",\n      \"function\": {\n        \"name\": \"get_rain_probability\",\n        \"description\": \"Get the probability of rain for a specific location\",\n        \"parameters\": {\n          \"type\": \"object\",\n          \"properties\": {\n            \"location\": {\n              \"type\": \"string\",\n              \"description\": \"The city and state, e.g., San Francisco, CA\"\n            }\n          },\n          \"required\": [\"location\"]\n        }\n      }\n    }\n  ]\n    response = client.chat.completions.create(\n        model=TEST_MODEL,\n        messages=messages,\n        tools=tools,\n        tool_choice={\"type\": \"function\", \"function\": {\"name\": \"get_current_temperature\"}},\n    )\n    response_message = response.choices[0].message\n    print(\"\\n\", response_message, \"\\n\")\n        \n    tool_calls = response_message.tool_calls\n    # Step 2: check if the model wanted to call a function\n    if tool_calls:\n        # Step 3: call the function\n        # Note: the JSON response may not always be valid; be sure to handle errors\n        
available_functions = {\n            \"get_current_temperature\": get_current_temperature,\n            \"get_rain_probability\": get_rain_probability\n        }  # only two functions in this example, but you can have multiple\n        messages.append(response_message)  # extend conversation with assistant's reply\n        # Step 4: send the info for each function call and function response to the model\n        for tool_call in tool_calls:\n            print(tool_call, \"\\n\")\n            function_name = tool_call.function.name\n            function_to_call = available_functions[function_name]\n            function_args = json.loads(tool_call.function.arguments)\n            function_response = function_to_call(**function_args)\n            messages.append(\n                {\n                    \"tool_call_id\": tool_call.id,\n                    \"role\": \"tool\",\n                    \"name\": function_name,\n                    \"content\": function_response,\n                }\n            )  # extend conversation with function response\n    second_response = client.chat.completions.create(\n        model=TEST_MODEL,\n        messages=messages,\n    )  # get a new response from the model where it can see the function response\n    return second_response.choices[0].message.content\n\nprint(run_conversation())\n```\n\n##### Images\n- Create image example:\n```py\nimport openai\nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer anything\"})\n\nimages_url = client.images.generate(\n  model=\"playground-v2.5\",\n  prompt=\"A cute baby sea otter\",\n  n=2, # The number of images to generate\n  size=\"1792x1024\" # The size of image (view models.json for available sizes)\n)\n\nprint(images_url)\n```\n- Edit image example:\n```py\nimport openai\nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer 
anything\"})\n\nimages_url = client.images.edit(\n  image=\"https://imgcdn.stablediffusionweb.com/2024/4/29/0b0b8798-1965-4e3d-b0a8-d153728320d4.jpg\",\n  model=\"sdxl\",\n  prompt=\"A cute baby sea otter wearing a raincoat\",\n  n=1, # The number of images to generate\n  size=\"1024x1024\" # The size of image (view models.json for available sizes)\n)\n\nprint(images_url)\n```\n\n##### Models\n- List models example:\n```py\nimport openai\nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer anything\"})\n\nmodels = client.models.list()\n\nprint(models)\n```\n- Retrieve model example:\n```py\nimport openai\nclient = openai.OpenAI(api_key=\"anything\", base_url=\"http://127.0.0.1:8000/v1/\", default_headers={\"Authorization\": \"Bearer anything\"})\n\nmodel = client.models.retrieve(\"gpt-3.5-turbo-instruct\")\n\nprint(model)\n```\n\u003c/details\u003e\n\n### Basic Usage\n\u003cdetails close\u003e\n\u003csummary\u003eRead Docs\u003c/summary\u003e\n\n- Connecting to the API\n```py\ntokens = {\n    'p-b': 'p-b cookie here',\n    'p-lat': 'p-lat cookie here',\n}\n\n# Default setup\nfrom poe_api_wrapper import PoeApi\nclient = PoeApi(tokens=tokens)\n\n# Using Client with auto_proxy (default is False)\nclient = PoeApi(tokens=tokens, auto_proxy=True)\n\n# Passing proxies manually\nproxy_context = [\n    {\"https://\":X1, \"http://\":X1},\n    {\"https://\":X2, \"http://\":X2},\n    ...\n]\n\nclient = PoeApi(tokens=tokens, proxy=proxy_context) \n\n# Add formkey and cloudflare cookies to pass challenges\ntokens = {\n    'p-b': 'p-b cookie here',\n    'p-lat': 'p-lat cookie here',\n    'formkey': 'formkey here',\n    '__cf_bm': '__cf_bm cookie here', \n    'cf_clearance': 'cf_clearance cookie here'\n}\n```\n- Getting Chat Ids \u0026 Chat Codes\n```py\n# Get chat data of all bots (this will fetch all available threads)\nprint(client.get_chat_history()['data'])\n\u003e\u003e Output:\n{'chinchilla': 
[{'chatId': 74397929, 'chatCode': '2ith0h11zfyvsta1u3z', 'id': 'Q2hhdDo3NDM5NzkyOQ==', 'title': 'Comparison'}], 'code_llama_7b_instruct': [{'chatId': 74397392, 'chatCode': '2ithbduzsysy3g178hb', 'id': 'Q2hhdDo3NDM5NzM5Mg==', 'title': 'Decent Programmers'}], 'a2': [{'chatId': 74396838, 'chatCode': '2ith9nikybn4ksn51l8', 'id': 'Q2hhdDo3NDM5NjgzOA==', 'title': 'Reverse Engineering'}, {'chatId': 74396452, 'chatCode': '2ith79n4x0p0p8w5yue', 'id': 'Q2hhdDo3NDM5NjQ1Mg==', 'title': 'Clean Code'}], 'leocooks': [{'chatId': 74396246, 'chatCode': '2ith82wj0tjrggj46no', 'id': 'Q2hhdDo3NDM5NjI0Ng==', 'title': 'Pizza perfection'}], 'capybara': [{'chatId': 74396020, 'chatCode': '2ith5o3p8c5ajkdwd3k', 'id': 'Q2hhdDo3NDM5NjAyMA==', 'title': 'Greeting'}]}\n\n# Get chat data of a bot (this will fetch all available threads)\nprint(client.get_chat_history(\"a2\")['data'])\n\u003e\u003e Output:\n{'a2': [{'chatId': 74396838, 'chatCode': '2ith9nikybn4ksn51l8', 'id': 'Q2hhdDo3NDM5NjgzOA==', 'title': 'Reverse Engineering'}, {'chatId': 74396452, 'chatCode': '2ith79n4x0p0p8w5yue', 'id': 'Q2hhdDo3NDM5NjQ1Mg==', 'title': 'Clean Code'}]}\n\n# Get a defined number of most recent chat threads (using count param will ignore interval param)\n# Fetching all bots\nprint(client.get_chat_history(count=20)['data'])\n# Fetching 1 bot\nprint(client.get_chat_history(bot=\"a2\", count=20)['data'])\n\n# You can pass the number of bots fetched for each interval to both functions. 
(default is 50)\n# Fetching 200 chat threads of all bots each interval\nprint(client.get_chat_history(interval=200)['data'])\n# Fetching 200 chat threads of a bot each interval\nprint(client.get_chat_history(bot=\"a2\", interval=200)['data'])\n\n# Pagination Example:\n# Fetch the first 20 chat threads\nhistory = client.get_chat_history(count=20)\npages = [history['data']]\nnew_cursor = history['cursor']\n\n# Loop until there are no more pages (cursor is None)\nwhile new_cursor is not None:\n    # Fetch the next 20 chat threads with new_cursor\n    new_history = client.get_chat_history(count=20, cursor=new_cursor)\n    # Append the next 20 chat threads\n    pages.append(new_history['data'])\n    new_cursor = new_history['cursor']\n\n# Print the pages (20 chat threads each page)\nfor page, data in enumerate(pages):\n    print(f'This is page {page+1}')\n    for bot, value in data.items():\n        for thread in value:\n            print({bot: thread})\n```\n- Getting subscription info and remaining points\n```py\ndata = client.get_settings()\nprint(data)\n```\n- Sending messages \u0026 Streaming responses\n```py\nbot = \"a2\"\nmessage = \"What is reverse engineering?\"\n\n# Create a new chat thread\n# Streamed example:\nfor chunk in client.send_message(bot, message):\n    print(chunk[\"response\"], end=\"\", flush=True)\nprint(\"\\n\")\n\n# Non-streamed example:\nfor chunk in client.send_message(bot, message):\n    pass\nprint(chunk[\"text\"])\n\n# You can get the chatCode and chatId of the created thread to continue the conversation\nchatCode = chunk[\"chatCode\"]\nchatId = chunk[\"chatId\"]\n# You can also retrieve msgPrice\nmsgPrice = chunk[\"msgPrice\"]\n\n# Send a message to an existing chat thread\n# 1. Using chatCode\nfor chunk in client.send_message(bot, message, chatCode=\"2i58ciex72dom7im83r\"):\n    print(chunk[\"response\"], end=\"\", flush=True)\n# 2. 
Using chatId\nfor chunk in client.send_message(bot, message, chatId=59726162):\n    print(chunk[\"response\"], end=\"\", flush=True)\n# 3. Specify msgPrice manually (the wrapper fetches this automatically, but you can pass it yourself to reduce overhead)\nfor chunk in client.send_message(bot, message, chatId=59726162, msgPrice=msgPrice):\n    print(chunk[\"response\"], end=\"\", flush=True)\n```\n\u003e [!NOTE]\n\u003e Display names are the same as the codenames for custom bots, so you can simply pass the bot's display name into `client.send_message(bot, message)`\n- Sending concurrent messages\n```py\n# Use at your own risk; increase the timeout to avoid rate limits (default is 20)\n\nimport time, threading\nthread_count = 0\n\ndef message_thread(prompt, counter):\n    global thread_count\n    try:\n        for chunk in client.send_message(\"gpt3_5\", prompt):\n            pass\n        print(prompt+\"\\n\"+chunk[\"text\"]+\"\\n\"*3)\n    except Exception:\n        pass\n    finally:\n        # Always decrement, even on failure, so the main loop can exit\n        thread_count -= 1\n\nprompts = [\n  \"Write a paragraph about the impact of social media on mental health.\",\n  \"Write a paragraph about the history and significance of the Olympic Games.\",\n  \"Write a paragraph about the effects of climate change on the world's oceans.\",\n  \"Write a paragraph about the benefits and drawbacks of remote work for employees and companies.\",\n  \"Write a paragraph about the role of technology in modern education.\",\n  \"Write a paragraph about the history and impact of the Civil Rights Movement in America.\",\n  \"Write a paragraph about the impact of COVID-19 on global economies.\",\n  \"Write a paragraph about the rise and fall of the Roman Empire.\",\n  \"Write a paragraph about the benefits and drawbacks of genetically modified organisms (GMOs).\",\n  \"Write a paragraph about the impact of globalization on cultural identity.\",\n  \"Write a paragraph about the history and significance of the Mona Lisa painting.\",\n  \"Write a paragraph about 
the benefits and drawbacks of renewable energy sources.\",\n  \"Write a paragraph about the impact of social media on political discourse.\",\n  \"Write a paragraph about the history and impact of the Industrial Revolution.\",\n  \"Write a paragraph about the benefits and drawbacks of online shopping for consumers and businesses.\",\n  \"Write a paragraph about the impact of artificial intelligence on the job market.\",\n  \"Write a paragraph about the history and significance of the Great Wall of China.\",\n  \"Write a paragraph about the benefits and drawbacks of standardized testing in schools.\",\n  \"Write a paragraph about the impact of the feminist movement on women's rights.\",\n  \"Write a paragraph about the history and impact of the American Revolution.\"\n]\n\nfor i in range(len(prompts)):\n    t = threading.Thread(target=message_thread, args=(prompts[i], i), daemon=True)\n    # Increment before starting the thread to avoid a race with the decrement\n    thread_count += 1\n    t.start()\n    time.sleep(1)\n\nwhile thread_count:\n    time.sleep(0.01)\n```\n- Retrying the last message\n```py\nfor chunk in client.retry_message(chatCode):\n    print(chunk['response'], end='', flush=True)\n```\n- Adding file attachments\n```py\n# Web URLs example:\nfile_urls = [\"https://elinux.org/images/c/c5/IntroductionToReverseEngineering_Anderson.pdf\", \n            \"https://www.kcl.ac.uk/warstudies/assets/automation-and-artificial-intelligence.pdf\"]\nfor chunk in client.send_message(bot, \"Compare 2 files and describe them in 300 words\", file_path=file_urls):\n    print(chunk[\"response\"], end=\"\", flush=True)\n\n# Local paths example:\nlocal_paths = [\"c:\\\\users\\\\snowby666\\\\hello_world.py\"]\nfor chunk in client.send_message(bot, \"What is this file about?\", file_path=local_paths):\n    print(chunk[\"response\"], end=\"\", flush=True)\n```\n\u003e [!NOTE]\n\u003e The file size limit is different for each model.\n- Retrieving suggested replies\n```py\nfor chunk in client.send_message(bot, \"Introduce 5 books about clean 
code\", suggest_replies=True):\n    print(chunk[\"response\"], end=\"\", flush=True)\nprint(\"\\n\")\n\nfor reply in chunk[\"suggestedReplies\"]:\n    print(reply)\n```\n- Stopping message generation\n```py\n# You can use an event to trigger this function\n# Example:\n# Note that keyboard library may not be compatible with MacOS, Linux, Ubuntu\nimport keyboard\nfor chunk in client.send_message(bot, message):\n    print(chunk[\"response\"], end=\"\", flush=True)\n    # Press Q key to stop the generation\n    if keyboard.is_pressed('q'):\n        client.cancel_message(chunk)\n        print(\"\\nMessage is now cancelled\")\n        break \n```\n- Deleting chat threads\n```py\n# Delete 1 chat\n# Using chatCode\nclient.delete_chat(bot, chatCode=\"2i58ciex72dom7im83r\")\n# Using chatId\nclient.delete_chat(bot, chatId=59726162)\n\n# Delete n chats\n# Using chatCode\nclient.delete_chat(bot, chatCode=[\"LIST_OF_CHAT_CODES\"])\n# Using chatId\nclient.delete_chat(bot, chatId=[\"LIST_OF_CHAT_IDS\"])\n\n# Delete all chats of a bot\nclient.delete_chat(bot, del_all=True)\n```\n- Clearing conversation context\n```py\n# 1. Using chatCode\nclient.chat_break(bot, chatCode=\"2i58ciex72dom7im83r\")\n# 2. Using chatId\nclient.chat_break(bot, chatId=59726162)\n```\n- Purging messages of 1 bot\n  \n```py\n# Purge a defined number of messages (default is 50)\n# 1. Using chatCode\nclient.purge_conversation(bot, chatCode=\"2i58ciex72dom7im83r\", count=10)\n# 2. Using chatId\nclient.purge_conversation(bot, chatId=59726162, count=10)\n\n# Purge all messsages of the thread\n# 1. Using chatCode\nclient.purge_conversation(bot, chatCode=\"2i58ciex72dom7im83r\", del_all=True)\n# 2. 
Using chatId\nclient.purge_conversation(bot, chatId=59726162, del_all=True)\n```\n- Purging all messages of the user\n```py\nclient.purge_all_conversations()\n```\n- Fetching previous messages\n```py\n# Get a defined number of messages (default is 50)\n# Using chatCode\nprevious_messages = client.get_previous_messages('code_llama_34b_instruct', chatCode='2itg2a7muygs42v1u0k', count=2)\n# Using chatId\nprevious_messages = client.get_previous_messages('code_llama_34b_instruct', chatId=74411139, count=2)\nfor message in previous_messages:\n    print(message)\n\u003e\u003e Output:\n{'author': 'human', 'text': 'nice to meet you', 'messageId': 2861709279}\n{'author': 'code_llama_34b_instruct', 'text': \" Nice to meet you too! How are you doing today? Is there anything on your mind that you'd like to talk about? I'm here to listen and help\", 'messageId': 2861873125}\n\n# Get messages with extended metadata (state and creationTime)\n# Using chatCode\nprevious_messages = client.get_previous_messages('code_llama_34b_instruct', chatCode='2itg2a7muygs42v1u0k', include_extended=True)\n# Using chatId\nprevious_messages = client.get_previous_messages('code_llama_34b_instruct', chatId=74411139, include_extended=True)\n\u003e\u003e Output:\n{'author': 'human', 'text': 'hi there', 'messageId': 2861363514, 'state': 'complete', 'creationTime': 1732029401216595}\n\n# Get all previous messages\n# Using chatCode\nprevious_messages = client.get_previous_messages('code_llama_34b_instruct', chatCode='2itg2a7muygs42v1u0k', get_all=True)\n# Using chatId\nprevious_messages = client.get_previous_messages('code_llama_34b_instruct', chatId=74411139, get_all=True)\nfor message in previous_messages:\n    print(message)\n\u003e\u003e Output:\n{'author': 'human', 'text': 'hi there', 'messageId': 2861363514}\n{'author': 'code_llama_34b_instruct', 'text': \" Hello! It's nice to meet you. 
Is there something I can help you with or would you like to chat?\", 'messageId': 2861363530}\n{'author': 'chat_break', 'text': \"\", 'messageId': 2872383991}\n{'author': 'human', 'text': 'nice to meet you', 'messageId': 2861709279}\n{'author': 'code_llama_34b_instruct', 'text': \" Nice to meet you too! How are you doing today? Is there anything on your mind that you'd like to talk about? I'm here to listen and help\", 'messageId': 2861873125}\n```\n\u003e [!NOTE]\n\u003e Messages are fetched from the latest to the oldest, but they are displayed in chronological order.\n- Getting available knowledge bases\n```py\n# Get a defined number of sources (default is 10)\nprint(client.get_available_knowledge(botName=\"BOT_NAME\", count=2))\n\u003e\u003e Output:\n{'What is Quora?': [86698], 'Founders of Quora': [86705]}\n# Get all available sources\nprint(client.get_available_knowledge(botName=\"BOT_NAME\", get_all=True))\n```\n- Uploading knowledge bases\n```py\n# Web URLs example:\nfile_urls = [\"https://elinux.org/images/c/c5/IntroductionToReverseEngineering_Anderson.pdf\", \n            \"https://www.kcl.ac.uk/warstudies/assets/automation-and-artificial-intelligence.pdf\"]\nsource_ids = client.upload_knowledge(file_path=file_urls)\nprint(source_ids)\n\u003e\u003e Output:\n{'er-1-intro_to_re.pdf': [86344], 'automation-and-artificial-intelligence.pdf': [86345]}\n\n# Local paths example:\nlocal_paths = [\"c:\\\\users\\\\snowby666\\\\hello_world.py\"]\nsource_ids = client.upload_knowledge(file_path=local_paths)\nprint(source_ids)\n\u003e\u003e Output:\n{'hello_world.py': [86523]}\n\n# Plain texts example:\nknowledges = [\n    {\n        \"title\": \"What is Quora?\",\n        \"content\": \"Quora is a popular online platform that enables users to ask questions on various topics and receive answers from a diverse community. 
It covers a wide range of subjects, from academic and professional queries to personal experiences and opinions, fostering knowledge-sharing and meaningful discussions among its users worldwide.\"\n    },\n    {\n        \"title\": \"Founders of Quora\",\n        \"content\": \"Quora was founded by two individuals, Adam D'Angelo and Charlie Cheever. Adam D'Angelo, who previously served as the Chief Technology Officer (CTO) at Facebook, and Charlie Cheever, a former Facebook employee as well, launched Quora in June 2009. They aimed to create a platform that would enable users to ask questions and receive high-quality answers from knowledgeable individuals. Since its inception, Quora has grown into a widely used question-and-answer platform with a large user base and a diverse range of topics covered.\"\n    },\n]\nsource_ids = client.upload_knowledge(text_knowledge=knowledges)\nprint(source_ids)\n\u003e\u003e Output:\n{'What is Quora?': [86368], 'Founders of Quora': [86369]}\n\n# Hybrid example:\nsource_ids = client.upload_knowledge(file_path=file_urls, text_knowledge=knowledges)\nprint(source_ids)\n\u003e\u003e Output:\n{'What is Quora?': [86381], 'Founders of Quora': [86383], 'er-1-intro_to_re.pdf': [86395], 'automation-and-artificial-intelligence.pdf': [86396]}\n```\n- Editing knowledge bases (Only for plain texts)\n```py\nclient.edit_knowledge(knowledgeSourceId=86381, title='What is Quora?', content='Quora is a question-and-answer platform where users can ask questions, provide answers, and engage in discussions on various topics.')\n```\n- Getting bot info\n```py\nbot = 'gpt-4'\nprint(client.get_botInfo(handle=bot))\n\u003e\u003e Output:\n{'handle': 'GPT-4', 'model': 'beaver', 'supportsFileUpload': True, 'messageTimeoutSecs': 15, 'displayMessagePointPrice': 350, 'numRemainingMessages': 20, 'viewerIsCreator': False, 'id': 'Qm90OjMwMDc='}\n```\n- Getting available creation models\n```py\nprint(client.get_available_creation_models())\n\u003e\u003e 
Output:\n{'text': ['claude_3_igloo', 'gpt4_o_mini', 'gpt4_o', 'gemini_1_5_flash', 'gemini_1_5_pro', 'claude_2_1_bamboo', 'claude_3_haiku', 'claude_2_1_cedar', 'gemini_1_5_flash_128k', 'gemini_1_5_pro_128k', 'gemini_1_5_flash_1m', 'gemini_1_5_pro_1m', 'gpt4_o_mini_128k', 'gpt4_o_128k', 'beaver', 'gemini_pro', 'chinchilla', 'vizcacha', 'claude_3_igloo_200k', 'claude_3_sonnet_200k', 'claude_3_haiku_200k', 'claude_3_opus_200k', 'mixtral8x7bchat', 'claude_2_short', 'a2_2', 'mythomaxl213b', 'a2', 'a2_100k'], 'image': ['playgroundv25', 'ideogram', 'dalle3', 'stablediffusion3', 'sd3turbo', 'stablediffusionxl'], 'video': ['pika']}\n```\n- Creating a new Bot\n```py\nclient.create_bot(handle=\"BOT_NAME\", prompt=\"PROMPT_HERE\", base_model=\"a2\")\n\n# Using knowledge bases (you can use source_ids from uploaded knowledge bases for your custom bot)\nclient.create_bot(handle=\"BOT_NAME\", prompt=\"PROMPT_HERE\", base_model=\"a2\", knowledgeSourceIds=source_ids, shouldCiteSources=True)\n```\n- Editing a Bot\n```py\nclient.edit_bot(handle=\"BOT_NAME\", prompt=\"PROMPT_HERE\", new_handle=\"NEW_BOT_NAME\", base_model='chinchilla')\n\n# Adding knowledge bases \nclient.edit_bot(handle=\"BOT_NAME\", prompt=\"PROMPT_HERE\", new_handle=\"NEW_BOT_NAME\", base_model='chinchilla', knowledgeSourceIdsToAdd=source_ids, shouldCiteSources=True)\n\n# Removing knowledge bases\nclient.edit_bot(handle=\"BOT_NAME\", prompt=\"PROMPT_HERE\", new_handle=\"NEW_BOT_NAME\", base_model='chinchilla', knowledgeSourceIdsToRemove=source_ids, shouldCiteSources=True)\n```\n\u003e [!TIP]\n\u003e You can also use both `knowledgeSourceIdsToAdd` and `knowledgeSourceIdsToRemove` at the same time.\n- Deleting a Bot\n```py\nclient.delete_bot(handle=\"BOT_NAME\")\n```\n- Getting available bots (your bots section)\n```py\n# Get a defined number of bots (default is 25)\nprint(client.get_available_bots(count=10))\n# Get all available bots\nprint(client.get_available_bots(get_all=True))\n```\n- Getting a user's 
bots\n```py\nhandle = 'poe'\nprint(client.get_user_bots(user=handle))\n```\n- Getting available categories\n```py\nprint(client.get_available_categories())\n\u003e\u003e Output:\n['Official', 'Popular', 'New', 'ImageGen', 'AI', 'Professional', 'Funny', 'History', 'Cooking', 'Advice', 'Mind', 'Programming', 'Travel', 'Writing', 'Games', 'Learning', 'Roleplay', 'Utilities', 'Sports', 'Music']\n```\n- Exploring 3rd party bots and users\n```py\n# Explore section example:\n# Get a defined number of bots (default is 50)\nprint(client.explore(count=10))\n# Get all available bots\nprint(client.explore(explore_all=True))\n\n# Search for bots by query example:\n# Get a defined number of bots (default is 50)\nprint(client.explore(search=\"Midjourney\", count=30))\n# Get all available bots\nprint(client.explore(search=\"Midjourney\", explore_all=True))\n\n# Search for bots by category example (default is defaultCategory):\n# Get a defined number of bots (default is 50)\nprint(client.explore(categoryName=\"Popular\", count=30))\n# Get all available bots\nprint(client.explore(categoryName=\"AI\", explore_all=True))\n\n# Search for people example:\n# Get a defined number of people (default is 50)\nprint(client.explore(search=\"Poe\", entity_type='user', count=30))\n# Get all available people\nprint(client.explore(search=\"Poe\", entity_type='user', explore_all=True))\n```\n- Sharing \u0026 Importing messages\n```py\n# Share a defined number of messages (from the latest to the oldest)\n# Using chatCode\nshareCode = client.share_chat(\"a2\", chatCode=\"2roap5g8nd7s28ul836\", count=10)\n# Using chatId\nshareCode = client.share_chat(\"a2\", chatId=204052028, count=10)\n\n# Share all messages\n# Using chatCode\nshareCode = client.share_chat(\"a2\", chatCode=\"2roap5g8nd7s28ul836\")\n# Using chatId\nshareCode = client.share_chat(\"a2\", chatId=204052028)\n\n# Set up the 2nd client and import messages from the shareCode\nclient2 = 
PoeApi(\"2nd_TOKEN_HERE\")\nprint(client2.import_chat(bot, shareCode))\n\u003e\u003e Output:\n{'chatId': 72929127, 'chatCode': '2iw0xcem7a18wy1avd3'}\n```\n- Getting citations\n```py\nprint(client.get_citations(messageId=141597902621))\n```\n\u003c/details\u003e\n\n### Bots Group Chat\n\u003cdetails close\u003e\n\u003csummary\u003eRead Docs\u003c/summary\u003e\n\n- Creating a group chat\n```py\nbots = [\n    {'bot': 'yayayayaeclaude', 'name': 'Yae'}, \n    {'bot': 'gepardL', 'name': 'gepard'}, \n    {'bot': 'SayukiTokihara', 'name': 'Sayuki'}\n]\n\nclient.create_group(group_name='Hangout', bots=bots) \n```\n\u003e [!NOTE]\n\u003e `bot` arg is the model/displayName.\n\u003e `name` arg is the one you'd mention them in group chat.\n- Sending messages and Streaming responses in group chat\n```py\n# User engagement example:\nwhile True: \n    message = str(input('\\n\\033[38;5;121mYou : \\033[0m'))\n    prev_bot = \"\"\n    for chunk in client.send_message_to_group(group_name='Hangout', message=message):\n        if chunk['bot'] != prev_bot:\n            print(f\"\\n\\033[38;5;121m{chunk['bot']} : \\033[0m\", end='', flush=True)\n            prev_bot = chunk['bot']\n        print(chunk['response'], end='', flush=True)\n    print('\\n')\n\n# Auto-play example:\nwhile True:\n    prev_bot = \"\"\n    for chunk in client.send_message_to_group(group_name='Hangout', autoplay=True):\n        if chunk['bot'] != prev_bot:\n            print(f\"\\n\\033[38;5;121m{chunk['bot']} : \\033[0m\", end='', flush=True)\n            prev_bot = chunk['bot']\n        print(chunk['response'], end='', flush=True)\n    print('\\n')\n\n# Preset history example:\npreset_path = \"c:\\\\users\\\\snowby666\\\\preset.json\"\nprev_bot = \"\"\nfor chunk in client.send_message_to_group(group_name='Hangout', autoplay=True, preset_history=preset_path):\n    if chunk['bot'] != prev_bot:\n        print(f\"\\n\\033[38;5;121m{chunk['bot']} : \\033[0m\", end='', flush=True)\n        prev_bot = chunk['bot']\n   
 print(chunk['response'], end='', flush=True)\nprint('\\n')\nwhile True:\n    for chunk in client.send_message_to_group(group_name='Hangout', autoplay=True):\n        if chunk['bot'] != prev_bot:\n            print(f\"\\n\\033[38;5;121m{chunk['bot']} : \\033[0m\", end='', flush=True)\n            prev_bot = chunk['bot']\n        print(chunk['response'], end='', flush=True)\n    print('\\n')\n```\n\u003e [!NOTE]\n\u003e You can also change your name in the group chat by passing a new one to the above function: `client.send_message_to_group('Hangout', message=message, user='Danny')`\n\u003e If you want to auto-save the conversation log, simply set autosave to true: `client.send_message_to_group('Hangout', message=message, autosave=True)`\n- Deleting a group chat\n```py\nclient.delete_group(group_name='Hangout')\n```\n- Getting created groups\n```py\nprint(client.get_available_groups())\n```\n- Getting group data\n```py\nprint(client.get_group(group_name='Hangout'))\n```\n- Saving group chat history\n```py\n# Save as json in the same directory\nclient.save_group_history(group_name='Hangout')\n# Save with a local path (json only)\nlocal_path = \"c:\\\\users\\\\snowby666\\\\log.json\"\nclient.save_group_history(group_name='Hangout', file_path=local_path)\n```\n- Loading group chat history\n```py\nprint(client.load_group_history(file_path=local_path))\n```\n\u003c/details\u003e\n\n### Misc\n\u003cdetails close\u003e\n\u003csummary\u003eRead Docs\u003c/summary\u003e\n\n- How to find the chatCode manually?\n\nHere is an example; the chatCode is 23o1gxjhb9cfnlacdcd\n\n![](https://i.imgur.com/m1zDP36.png)\n\n- What file types does poe-api-wrapper support?\n\nCurrently, this API only supports the following file types for attachments:\n\n#### Text files\n| .pdf | .docx | .txt | .md | .py | .js | .ts | .html | .css | .csv | .c | .cs | .cpp | .lua | .rs | .rb | .go | .java |\n| - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |\n#### Media files\n| .png | .jpg | .jpeg | .gif | .mp4 | .mov | .mp3 | .wav |\n| - | - | - | - | - | - | - | - |\n\u003c/details\u003e\n\n## 🙌 Contributing\nWe would love to develop poe-api-wrapper together with our community! 💕\n### Run debug\nFirst, clone this repo:\n```ShellSession\ngit clone https://github.com/snowby666/poe-api-wrapper.git\ncd poe-api-wrapper\n```\nThen run the test cases:\n```ShellSession\npython -m pip install -e .[tests]\ntox\n```\n### Ways to contribute\n- Try poe-api-wrapper and give feedback\n- Add new integrations by opening a [PR](https://github.com/snowby666/poe-api-wrapper/pulls)\n- Help with open [issues](https://github.com/snowby666/poe-api-wrapper/issues) or [create your own](https://github.com/snowby666/poe-api-wrapper/issues/new/choose)\n- Share your thoughts and suggestions with us\n- Request a feature by submitting a proposal\n- Report a bug\n- **Improve documentation:** fix incomplete or missing docs, bad wording, examples or explanations.\n\n### Contributors\n\u003ca href=\"https://github.com/snowby666/poe-api-wrapper/graphs/contributors\"\u003e\n  \u003cimg src=\"https://contrib.rocks/image?repo=snowby666/poe-api-wrapper\" /\u003e\n\u003c/a\u003e\n\n\u003cbr\u003e\n\n\u003cimg src=\"https://repobeats.axiom.co/api/embed/cba15fced158acd258575d31fc14d7e5c59b07a3.svg\" alt=\"Repobeats analytics image\"\u003e\n\n## 🤝 Copyright\nThis program is licensed under the [GNU GPL v3](https://github.com/snowby666/poe-api-wrapper/blob/main/LICENSE). 
Most code has been written by me, [snowby666](https://github.com/snowby666).\n\n### Copyright Notice\n```\nsnowby666/poe-api-wrapper: A simple, lightweight and efficient API wrapper for Poe.com\nCopyright (C) 2023 snowby666\n\nThis program is free software: you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.\n\nThis program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the\nGNU General Public License for more details.\n\nYou should have received a copy of the GNU General Public License\nalong with this program.  If not, see \u003chttps://www.gnu.org/licenses/\u003e.\n```\n","funding_links":[],"categories":["Python","Chatbots"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsnowby666%2Fpoe-api-wrapper","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fsnowby666%2Fpoe-api-wrapper","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsnowby666%2Fpoe-api-wrapper/lists"}