https://github.com/daveschumaker/super-simple-chatui
A simple interface for interacting with LLMs via a local installation of Ollama
- Host: GitHub
- URL: https://github.com/daveschumaker/super-simple-chatui
- Owner: daveschumaker
- License: mit
- Created: 2024-05-22T16:37:08.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-05-22T17:07:10.000Z (over 1 year ago)
- Last Synced: 2025-03-17T21:41:21.564Z (7 months ago)
- Topics: llama3, llms, ollama, ollama-gui, reactjs, typescript
- Language: TypeScript
- Homepage:
- Size: 319 KB
- Stars: 2
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Super Simple ChatUI

## Overview
_Super Simple ChatUI_ is a React/TypeScript project that provides a simple and intuitive frontend UI for interacting with a local LLM (Large Language Model) through [Ollama](https://github.com/ollama/ollama). This project enables users to interact with their own LLMs locally, ensuring privacy and control over their data.
This project was set up using [Vite](https://vitejs.dev/), which allows for rapid development thanks to features like Hot Module Replacement, TypeScript support, CSS modules, and more.
## Installation
### Prerequisites
- Node (v18.17.0 or later)
- Ollama

### Steps
1. Clone the repository
```bash
git clone https://github.com/daveschumaker/super-simple-chatui.git
cd super-simple-chatui
```
2. Install dependencies
```bash
npm install
```
3. Run development server
```bash
npm run dev
```
4. Access the application by visiting the link shown in your terminal (Vite defaults to [http://localhost:5173](http://localhost:5173))
### Usage
1. Ensure that Ollama is running on your machine and exposes its API at: `http://localhost:11434`
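   Step 1 can be sanity-checked by hitting Ollama's HTTP API directly. The sketch below uses Ollama's documented `/api/generate` route with `stream: false`; the `llama3` model name is an assumption (use any model you have pulled locally):

   ```typescript
   // Minimal sketch of a non-streaming request to Ollama's API.
   const OLLAMA_URL = "http://localhost:11434";

   interface GenerateRequest {
     model: string;
     prompt: string;
     stream: boolean;
   }

   // Build the JSON payload; split out so it can be inspected/tested
   // without a running Ollama instance.
   function buildRequest(model: string, prompt: string): GenerateRequest {
     return { model, prompt, stream: false };
   }

   async function ask(prompt: string): Promise<string> {
     const res = await fetch(`${OLLAMA_URL}/api/generate`, {
       method: "POST",
       headers: { "Content-Type": "application/json" },
       body: JSON.stringify(buildRequest("llama3", prompt)),
     });
     const data = await res.json();
     // With stream: false, the full completion arrives in `response`.
     return data.response;
   }
   ```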
2. Interact with LLM: Use the super-simple-chatui interface to send queries to Ollama and receive responses.

## TODOs
- Add support for IndexedDB via Dexie (longer-term storage for conversations, system prompts, various settings, etc.)
- Add support for picking from available models via Ollama
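  For the model-picker item above, Ollama exposes a documented `/api/tags` endpoint that lists locally installed models. A rough sketch (the response parsing is split out so it can be exercised without a running Ollama instance):

  ```typescript
  // Sketch: fetch the list of locally available model names from Ollama.
  const OLLAMA_URL = "http://localhost:11434";

  interface TagsResponse {
    models: { name: string }[];
  }

  // Pure helper: extract just the model names from a /api/tags response.
  function modelNames(tags: TagsResponse): string[] {
    return tags.models.map((m) => m.name);
  }

  async function listModels(): Promise<string[]> {
    const res = await fetch(`${OLLAMA_URL}/api/tags`);
    return modelNames((await res.json()) as TagsResponse);
  }
  ```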
- Add support for chatting with models via the [AI Horde](https://horde.koboldai.net/)
- Add support for OpenAI's ChatGPT API via API key
- Write tests! Always with the tests.

### Contributing
Contributions are welcome! Please follow these steps to contribute:
1. Fork the repository.
2. Create a new branch (`git checkout -b feature-branch`).
3. Make your changes.
4. Commit your changes (`git commit -m 'Add new feature'`).
5. Push to the branch (`git push origin feature-branch`).
6. Open a pull request.

## License
This project is licensed under the MIT License. See the [LICENSE](/LICENSE) file for details.
## Acknowledgments
- Chat logo / favicon via Flaticon: [Bubble Chat free icon](https://www.flaticon.com/free-icon/chat_9356600)
- [Ollama](https://github.com/ollama/ollama)
- [Vite](https://vitejs.dev/)