{"id":23636423,"url":"https://github.com/skhelladi/OllamaDesk","last_synced_at":"2025-08-31T11:33:47.889Z","repository":{"id":270001486,"uuid":"909069220","full_name":"skhelladi/i-Assistant","owner":"skhelladi","description":"i-@ssistant is a web-based AI assistant application that allows users to interact with various models to obtain solutions to their queries. This application serves as a graphical interface for Ollama.","archived":false,"fork":false,"pushed_at":"2024-12-27T18:25:27.000Z","size":48,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2024-12-27T19:23:18.070Z","etag":null,"topics":["ai","llm","ollama","ollama-gui"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/skhelladi.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-12-27T16:55:27.000Z","updated_at":"2024-12-27T18:25:30.000Z","dependencies_parsed_at":"2024-12-27T19:23:19.539Z","dependency_job_id":"d088ae77-4d00-49a3-80ca-a7024c8e8944","html_url":"https://github.com/skhelladi/i-Assistant","commit_stats":null,"previous_names":["skhelladi/i-assistant"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/skhelladi%2Fi-Assistant","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/skhelladi%2Fi-Assistant/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/skhelladi%2Fi-Assistant/releases","manifests_url":"h
ttps://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/skhelladi%2Fi-Assistant/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/skhelladi","download_url":"https://codeload.github.com/skhelladi/i-Assistant/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":231590469,"owners_count":18396929,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","llm","ollama","ollama-gui"],"created_at":"2024-12-28T06:12:18.245Z","updated_at":"2025-08-31T11:33:47.876Z","avatar_url":"https://github.com/skhelladi.png","language":"JavaScript","funding_links":[],"categories":[],"sub_categories":[],"readme":"# OllamaDesk\n\nOllamaDesk serves as a graphical interface for [Ollama](https://ollama.com/) using free LLMs. 
The backend server handles model management and request processing.\n\n![OllamaDesk](assets/gui.png)\n\n## Features\n\n- Interactive chat interface\n- Model selection\n- Streamed responses\n- Retry and stop functionality\n- Settings for customization\n- Copy and paste functionality\n- Save answers to a text/markdown file\n- Save chat history\n- Dark/light mode\n\n## Installation\n\n### Prerequisites\n\n- Node.js (version 14 or higher)\n- npm (Node Package Manager)\n\n### Clone the Repository\n\n```bash\ngit clone https://github.com/skhelladi/OllamaDesk.git\ncd OllamaDesk\n```\n\n### Install Dependencies\n\n```bash\nnpm install express sqlite3 ollama crypto-js node-cache nodemon ip\n```\n\n### Install Ollama and its Models\n\nFollow the instructions on the [Ollama website](https://ollama.com/) to install Ollama and download the models you want to use.\n\n## Usage\n\n### Start the Server\n\n```bash\nnode server.mjs\n```\n\nThe server will start and listen on port 3333 by default. 
You can access the application by navigating to `http://localhost:3333` in your web browser.\n\n### Start the Server in Development Mode\n\n```bash\nnpm run dev\n```\n\nThis starts the server with `nodemon`, which automatically restarts it whenever file changes are detected.\n\n### Functionality\n\n- **Send Message**: Enter your message in the input field and click the send button to interact with the AI assistant.\n- **Stop Request**: Click the stop button to cancel the response currently being generated.\n- **Retry**: Click the retry button to resend the last user message and get a new response.\n- **Settings**: Click the settings button to configure response streaming, temperature, and the model context.\n\n## License\n\nThis project is licensed under the GPL-3.0 license.\n\nUnless you explicitly state otherwise, any contribution intentionally submitted by you for inclusion in this project shall be licensed as above, without any additional terms or conditions.\n\n## Author\n\n- Sofiane KHELLADI\n\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fskhelladi%2FOllamaDesk","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fskhelladi%2FOllamaDesk","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fskhelladi%2FOllamaDesk/lists"}