{"id":28443161,"url":"https://github.com/hardcodedev777/blazelama","last_synced_at":"2025-06-29T11:31:18.168Z","repository":{"id":292019989,"uuid":"978941470","full_name":"HardCodeDev777/BlazeLama","owner":"HardCodeDev777","description":"Blazor-based GUI for chatting with Ollama models on Windows","archived":true,"fork":false,"pushed_at":"2025-05-15T11:15:41.000Z","size":81666,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-06-06T07:02:04.000Z","etag":null,"topics":["blazor","blazor-application","blazor-server","llm","llms","ollama","ollama-gui"],"latest_commit_sha":null,"homepage":"","language":"C#","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/HardCodeDev777.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-05-06T18:37:33.000Z","updated_at":"2025-06-05T12:08:38.000Z","dependencies_parsed_at":"2025-05-07T18:55:34.489Z","dependency_job_id":null,"html_url":"https://github.com/HardCodeDev777/BlazeLama","commit_stats":null,"previous_names":["hardcodedev777/blazelama"],"tags_count":1,"template":false,"template_full_name":null,"purl":"pkg:github/HardCodeDev777/BlazeLama","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FBlazeLama","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FBlazeLama/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FBlazeLama/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/host
s/GitHub/repositories/HardCodeDev777%2FBlazeLama/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/HardCodeDev777","download_url":"https://codeload.github.com/HardCodeDev777/BlazeLama/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FBlazeLama/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":262585768,"owners_count":23332718,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["blazor","blazor-application","blazor-server","llm","llms","ollama","ollama-gui"],"created_at":"2025-06-06T07:01:00.712Z","updated_at":"2025-06-29T11:31:18.151Z","avatar_url":"https://github.com/HardCodeDev777.png","language":"C#","funding_links":[],"categories":[],"sub_categories":[],"readme":"![Ollama](https://img.shields.io/badge/Ollama-%23000000?logo=Ollama)\n![Blazor](https://img.shields.io/badge/Blazor-%23512BD4?logo=Blazor)\n![License](https://img.shields.io/github/license/HardCodeDev777/BlazeLama?color=%2305991d)\n![Last commit](https://img.shields.io/github/last-commit/HardCodeDev777/BlazeLama?color=%2305991d)\n![Tag](https://img.shields.io/github/v/tag/HardCodeDev777/BlazeLama)\n![Top Lang](https://img.shields.io/github/languages/top/HardCodeDev777/BlazeLama)\n \n\u003cdiv align=\"center\"\u003e\n    \u003cimg src=\"logo.png\" width=\"256\"\u003e\n\u003c/div\u003e\n\n# 🦙 BlazeLama\n\n**BlazeLama** is (hopefully) the first user-friendly, lightweight desktop chat client for [Ollama](https://ollama.com), built with **Blazor Server** and **WPF WebView**.\n \n---\n\n## 
🚀 Overview\n\nBlazeLama provides a simple and responsive UI for chatting with local Ollama models on Windows. Features include:\n\n- 🗨️ Create and delete chats, assign models to each chat\n- 🔧 Automatically launch Ollama when this app starts (if a path is provided in settings)\n- 💾 Store chat history and models in local JSON files\n\nIt works with any model downloaded via the [Ollama CLI](https://ollama.com/library).\n\n\u0026nbsp;\n\u0026nbsp;\n\n![BlazeLama](./app.gif)\n\n\n---\n\n## 📦 Installation\n\n\u003e **Requirement:** [Download and install Ollama](https://ollama.com) first.\n\n1. Download the latest `.zip` archive from the **Releases** page  \n2. Extract it anywhere  \n3. Launch `BlazeLamaDesktop.exe`  \n4. (Optional) In settings, specify the full path to `ollama.exe` for automatic startup\n\n---\n\n## 🔨 Usage\n\nAfter launching, you'll see the **Home menu** and a **\"Create new chat\"** button. Clicking it will open a dialog where you can define:\n\n| Field          | Description                            |\n|----------------|----------------------------------------|\n| **Chat name**  | Display name for your chat             |\n| **Model name** | Model you want to use for this session |\n\n\u003cdiv\u003e\n    \u003cimg src=\"ChatPopup.png\" width=\"512\"\u003e\n\u003c/div\u003e\n\n\u0026nbsp;\n\u0026nbsp;\n\nOn the left sidebar, you'll see your chat list and the **Home** button.\n\n\u003e [!WARNING]  \n\u003e A chat is saved only after at least one message is sent.\n\n\n\u003cdiv\u003e\n    \u003cimg src=\"ChatList.png\" width=\"410\"\u003e\n\u003c/div\u003e\n\n\u0026nbsp;\n\u0026nbsp;\n\nMessages, model names, and chat titles are stored in the `Data` folder, located next to the executable. 
They are stored as `.json` files.\n\n\u003e [!TIP]  \n\u003e If the app fails to load or gets stuck, you can manually delete corrupted data by closing the app and removing all JSON files from the `Data` folder.\n\n\u0026nbsp;\n\nYou can delete any chat by clicking the 🗑 icon next to it.\n\nThere is **no limit on the number of chats**. The message length and memory limitations depend on the model you use.\n\n---\n\n### ⚙️ Settings\n\nWhile on the Home screen, click the ⚙️ icon in the top right corner to open the Settings popup:\n\n\u003cdiv\u003e\n    \u003cimg src=\"Settings.png\" width=\"410\"\u003e\n\u003c/div\u003e\n\n| Setting                   | Description |\n|---------------------------|-------------|\n| **Path to Ollama app.exe** | If you want BlazeLama to automatically launch Ollama, enter the full path to `ollama.exe` here |\n| **Delete data**           | Deletes **all chat data**, including histories, chat names, and model references |\n\n---\n\n## TODO\n\n- [ ]  Refactor some code areas to look cleaner 🙂\n- [ ]  Better error handling\n\n---\n\n## Credits for Styles\n\nSome styles used in this project were borrowed from an open-source template found online. Unfortunately, I couldn't locate the original source or author. If you're the original creator, feel free to contact me so I can give credit or remove the styles if needed.\n\n---\n\n## 📄 License\n\nThis project is licensed under the **MIT License**.  \nSee LICENSE for full terms.\n\n---\n\n## 👨‍💻 Author\n\n**HardCodeDev**  \n-  [GitHub](https://github.com/HardCodeDev777)  \n-  [Itch.io](https://hardcodedev.itch.io/)\n\n---\n\n\u003e 💬 Got feedback, found a bug, or want to contribute? Open an issue or fork the repo on GitHub!\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhardcodedev777%2Fblazelama","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fhardcodedev777%2Fblazelama","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhardcodedev777%2Fblazelama/lists"}