{"id":24020168,"url":"https://github.com/PySpur-Dev/PySpur","last_synced_at":"2025-09-14T20:31:52.380Z","repository":{"id":263900282,"uuid":"861892731","full_name":"PySpur-Dev/pyspur","owner":"PySpur-Dev","description":"Graph-Based Editor for LLM Workflows","archived":false,"fork":false,"pushed_at":"2025-01-05T13:27:06.000Z","size":4967,"stargazers_count":1087,"open_issues_count":4,"forks_count":77,"subscribers_count":15,"default_branch":"main","last_synced_at":"2025-01-05T14:29:50.229Z","etag":null,"topics":["agent","agents","ai","javascript","llm","llm-inference","openai","python","react","workflow"],"latest_commit_sha":null,"homepage":"https://pyspur.dev","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/PySpur-Dev.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-09-23T17:24:52.000Z","updated_at":"2025-01-05T13:27:10.000Z","dependencies_parsed_at":"2024-12-14T18:27:40.343Z","dependency_job_id":"c8eaac1d-ed72-4299-a298-39a763c4d01a","html_url":"https://github.com/PySpur-Dev/pyspur","commit_stats":null,"previous_names":["pyspur-com/pyspur","pyspur-dev/pyspur"],"tags_count":2,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PySpur-Dev%2Fpyspur","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PySpur-Dev%2Fpyspur/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PySpur-Dev%2Fpyspur/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PySpur-Dev%2
Fpyspur/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/PySpur-Dev","download_url":"https://codeload.github.com/PySpur-Dev/pyspur/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":233026988,"owners_count":18613583,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["agent","agents","ai","javascript","llm","llm-inference","openai","python","react","workflow"],"created_at":"2025-01-08T12:00:49.689Z","updated_at":"2025-01-08T12:02:04.869Z","avatar_url":"https://github.com/PySpur-Dev.png","language":"TypeScript","readme":"# PySpur - Graph-Based Editor for LLM Workflows\n\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"./README.md\"\u003e\u003cimg alt=\"README in English\" src=\"https://img.shields.io/badge/English-blue\"\u003e\u003c/a\u003e\n  \u003ca href=\"./README_CN.md\"\u003e\u003cimg alt=\"简体中文版自述文件\" src=\"https://img.shields.io/badge/简体中文-blue\"\u003e\u003c/a\u003e\n  \u003ca href=\"./README_JA.md\"\u003e\u003cimg alt=\"日本語のREADME\" src=\"https://img.shields.io/badge/日本語-blue\"\u003e\u003c/a\u003e\n  \u003ca href=\"./README_KR.md\"\u003e\u003cimg alt=\"README in Korean\" src=\"https://img.shields.io/badge/한국어-blue\"\u003e\u003c/a\u003e\n  \u003ca href=\"./README_DE.md\"\u003e\u003cimg alt=\"Deutsche Version der README\" src=\"https://img.shields.io/badge/Deutsch-blue\"\u003e\u003c/a\u003e\n\u003ca href=\"./README_FR.md\"\u003e\u003cimg alt=\"Version française du README\" src=\"https://img.shields.io/badge/Français-blue\"\u003e\u003c/a\u003e\n\u003ca 
href=\"./README_ES.md\"\u003e\u003cimg alt=\"Versión en español del README\" src=\"https://img.shields.io/badge/Español-blue\"\u003e\u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://discord.gg/7Spn7C8A5F\"\u003e\n    \u003cimg alt=\"Join Our Discord\" src=\"https://img.shields.io/badge/Discord-7289DA.svg?style=for-the-badge\u0026logo=discord\u0026logoColor=white\"\u003e\n  \u003c/a\u003e\n\u003c/p\u003e\n\n\nhttps://github.com/user-attachments/assets/9128885b-47ba-4fc6-ab6b-d567f52e332c\n\n# ✨ Core Benefits\n\n## Modular Building Blocks\n\nhttps://github.com/user-attachments/assets/6442f0ad-86d8-43d9-aa70-e5c01e55e876\n\n## Debug at Node Level\n\nhttps://github.com/user-attachments/assets/6e82ad25-2a46-4c50-b030-415ea9994690\n\n## Evaluate Final Performance\n\nhttps://github.com/user-attachments/assets/4dc2abc3-c6e6-4d6d-a5c3-787d518de7ae\n\n## Coming soon: Self-improvement\n\nhttps://github.com/user-attachments/assets/5bef7a16-ef9f-4650-b385-4ea70fa54c8a\n\n\n# 🕸️ Why PySpur?\n\n* **Easy-to-hack**, e.g., one can add new workflow nodes by simply creating a single Python file.\n* **JSON configs** of workflow graphs, enabling easy sharing and version control.\n* **Lightweight** via minimal dependencies, avoiding bloated LLM frameworks.\n\n# ⚡ Quick start\n\nYou can launch PySpur using pre-built Docker images in the following steps:\n\n1. **Clone the repository:**\n    ```sh\n    git clone https://github.com/PySpur-Dev/pyspur.git\n    cd pyspur\n    ```\n\n2. **Create a .env file:**\n\n    Create a `.env` file at the root of the project. You may use `.env.example` as a starting point.\n    ```sh\n    cp .env.example .env\n    ```\n    **Please go through the `.env` file and change configs wherever necessary.**\n    **If you plan to use third-party model providers, please add their API keys to the `.env` file in this step.**\n\n3. 
**Start the Docker services:**\n\n    ```sh\n    docker compose -f ./docker-compose.prod.yml up --build -d\n    ```\n\n    This will start a local instance of PySpur that will store spurs and other state information in a postgres database. A local postgres service is used by default. Override `POSTGRES_*` variables in the `.env` file to use an external postgres database.\n\n4. **Access the portal:**\n\n    Go to `http://localhost:6080/` in your browser.\n\n\nSetup is complete. Click on \"New Spur\" to create a workflow, or start with one of the stock templates.\n\n\n5. **[Optional] Manage your LLM provider keys from the app:**\n\n   Once the PySpur app is running, you can manage your LLM provider keys through the portal:\n\n   \u003cimg width=\"1913\" alt=\"image\" src=\"https://github.com/user-attachments/assets/32fe79f1-f518-4df5-859c-1d1c0fc0570e\" /\u003e\n\n   Select the API keys tab\n\n   \u003cimg width=\"441\" alt=\"image\" src=\"https://github.com/user-attachments/assets/cccc7e27-c10b-4f3a-b818-3b65c55f4170\" /\u003e\n\n   Enter your provider's key and click save (the save button appears after you add or modify a key)\n\n   \u003cimg width=\"451\" alt=\"image\" src=\"https://github.com/user-attachments/assets/e35ba2bb-4c60-4b13-9a8d-cc47cac45375\" /\u003e\n\n\n# 🛠️ PySpur Development Setup\n#### [ Instructions for development on Unix-like systems. Development on Windows/PC not tested ]\n\nThe steps for dev setup are the same as above, except for step 3: we launch the app in dev mode instead.\n\n3. **Start the Docker services:**\n\n    ```sh\n    docker compose up --build -d\n    ```\n\n    This will start a local instance of PySpur that will store spurs and other state information in a postgres database. A local postgres service is used by default. 
Override `POSTGRES_*` variables in the `.env` file to use an external postgres database.\n\n\n# 🦙 Using PySpur with Ollama (Local Models)\n\nPySpur can work with local models served using Ollama.\n\nFollow these steps to configure PySpur to work with Ollama running on the same host.\n\n### 1. Configure Ollama\nTo ensure the Ollama API is reachable from PySpur, we need to start the Ollama service with the environment variable `OLLAMA_HOST=0.0.0.0`. This allows requests coming from PySpur's Docker bridge network to get through to Ollama.\nAn easy way to do this is to launch the Ollama service with the following command:\n```sh\nOLLAMA_HOST=\"0.0.0.0\" ollama serve\n```\n\n### 2. Update the PySpur .env file\nNext, we need to update the `OLLAMA_BASE_URL` environment variable in the `.env` file.\nIf your Ollama port is 11434 (the default port), then the entry in the `.env` file should look like this:\n```sh\nOLLAMA_BASE_URL=http://host.docker.internal:11434\n```\n(Please make sure that there is no trailing slash at the end!)\n\nIn PySpur's setup, `host.docker.internal` refers to the host machine where both PySpur and Ollama are running.\n\n### 3. Launch the PySpur app\nFollow the usual steps to launch the PySpur app, starting with the command:\n```sh\ndocker compose -f docker-compose.prod.yml up --build -d\n```\n\nIf you wish to do PySpur development with Ollama, please run the following command instead of the above:\n```sh\ndocker compose -f docker-compose.yml up --build -d\n```\n\n\n### 4. Using Ollama models in the app\nYou will be able to select Ollama models [`ollama/llama3.2`, `ollama/llama3`, ...] from the sidebar for LLM nodes.\nPlease make sure the model you select is explicitly downloaded in Ollama; that is, you will need to manually manage these models via Ollama. To download a model, you can simply run `ollama pull \u003cmodel-name\u003e`.\n\n## Note on supported models\nPySpur only works with models that support structured output and JSON mode. 
Most newer models should work, but it is still worth confirming this in the Ollama documentation for the model you wish to use.\n\n# ⭐ Support us\n\nYou can support us in our work by leaving a star! Thank you!\n\n![star](https://github.com/user-attachments/assets/71f65273-6755-469d-be44-087bb89d5e76)\n\n# 🗺️ Roadmap\n\n- [x] Canvas\n- [x] Async/Batch Execution\n- [x] Evals\n- [x] Spur API\n- [x] Support Ollama\n- [ ] New Nodes\n    - [x] LLM Nodes\n    - [x] If-Else\n    - [x] Merge Branches\n    - [ ] Tools\n    - [ ] Loops\n- [ ] RAG\n- [ ] Pipeline optimization via DSPy and related methods\n- [ ] Templates\n- [ ] Compile Spurs to Code\n- [ ] Multimodal support\n- [ ] Containerization of Code Verifiers\n- [ ] Leaderboard\n- [ ] Generate Spurs via AI\n\nYour feedback will be massively appreciated.\nPlease [tell us](mailto:founders@pyspur.dev?subject=Feature%20Request\u0026body=I%20want%20this%20feature%3Ai) which features on that list you'd like to see next, or request entirely new ones.\n\n\n\n","funding_links":[],"categories":["A01_文本生成_文本对话"],"sub_categories":["大语言对话模型及数据"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPySpur-Dev%2FPySpur","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FPySpur-Dev%2FPySpur","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPySpur-Dev%2FPySpur/lists"}