https://github.com/shekharp1536/ollama-web
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
- Host: GitHub
- URL: https://github.com/shekharp1536/ollama-web
- Owner: shekharP1536
- Created: 2024-11-06T23:25:08.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-12-02T15:37:28.000Z (7 months ago)
- Last Synced: 2024-12-22T01:55:45.256Z (6 months ago)
- Topics: llama, llama-cpp, llama3, llm-inference, ollama, ollama-app, ollama-chat, ollama-client, ollama-gui, ollama-interface, ollama-python, ollama-ui, ollama-webui, python-llm-integration
- Language: JavaScript
- Homepage: https://ollama-web.vercel.app
- Size: 1.17 MB
- Stars: 2
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# ✨💻Ollama Web💡🤝
Ollama Web is a web-based interface for chatting with large language models served locally by Ollama. This README will guide you through setting up and running the project on both Windows and Linux systems.
## 📷 Preview

Take a quick look at the Ollama Web interface:
### Images
Below are some snapshots of the interface:

1. **Chat UI**
2. **Models page**
3. **Other**
---
## Prerequisites

Before running the project, you need to download and set up **Ollama** on your system.
### 1. 🔽 Download and Install Ollama
### For Windows Users:
1. Visit the [Ollama download page](https://ollama.com/download) for Windows.
2. Download the Windows installer and run it.
3. Follow the installation instructions to complete the process.

### For Linux Users:
1. Open your terminal and run the following command to install Ollama:
```bash
curl -fsSL https://ollama.com/install.sh | bash
```
2. After installation, verify that Ollama is installed successfully by running:
```bash
ollama --version
```
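As an optional extra check, you can confirm that the Ollama server itself is reachable. This is a minimal sketch using only the Python standard library and assuming Ollama's default API address of `http://localhost:11434`; adjust the URL if your setup differs:

```python
# quick_check.py: confirm that the local Ollama server is reachable.
# Assumes Ollama's default API address (http://localhost:11434).
import json
import urllib.request

OLLAMA_VERSION_URL = "http://localhost:11434/api/version"

try:
    with urllib.request.urlopen(OLLAMA_VERSION_URL, timeout=5) as resp:
        info = json.load(resp)
    print("Ollama is running, version:", info.get("version", "unknown"))
except OSError as err:
    print(f"Could not reach Ollama at {OLLAMA_VERSION_URL}: {err}")
```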
### 2. 🧷 Clone the Repository

To get the code, clone the `ollama-web` repository from GitHub:
```bash
git clone https://github.com/shekharP1536/ollama-web.git
cd ollama-web
```
### 3. 📥 Install Required Python Packages

Once Ollama is installed, you need to install the necessary Python packages to run the project.
1. Create a virtual environment (optional but recommended) to keep dependencies isolated:
```bash
python3 -m venv venv
```
2. Activate the virtual environment:
- **On Windows:**
```bash
.\venv\Scripts\activate
```
- **On Linux/Mac:**
```bash
source venv/bin/activate
```
3. Install the required Python packages:
```bash
pip install -r requirements.txt
```
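The repository is tagged `ollama-python`, so the backend presumably talks to the local Ollama server through the `ollama` Python package. Purely as a hedged illustration of what such a call looks like (the model name `llama3` is an assumption, not taken from this project's code):

```python
# chat_sketch.py: minimal example of talking to a local Ollama model from Python.
# Assumptions: the `ollama` package is installed (pip install ollama), the Ollama
# server is running, and a model named "llama3" has been pulled (ollama pull llama3).
import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
# The reply text lives under message -> content in the response.
print(response["message"]["content"])
```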
### 4. 🚀 Run the Application

From the project directory, run the `index.py` script to start the web application:
```bash
python index.py
```

### 5. 🌐 Access the Application
Once the application is running, open your web browser and go to:
```
http://localhost:5000
```
This will open the Ollama Web interface, and you can start using it locally!
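Port 5000 is the Flask development server's default, which suggests `index.py` is a small Flask app. The repository's actual `index.py` will differ, but a rough sketch of how such an entry point could bridge a web UI and Ollama might look like this (the `/api/chat` route, the default model, and the placeholder home page are illustrative assumptions):

```python
# Hypothetical sketch of a Flask entry point such as index.py; the project's real
# implementation may be organised quite differently.
# Assumes Flask and the `ollama` package are installed and Ollama is running locally.
from flask import Flask, jsonify, request
import ollama

app = Flask(__name__)


@app.route("/")
def home():
    # The real app serves the chat UI here; a placeholder keeps this sketch self-contained.
    return "<h1>Ollama Web (sketch)</h1>"


@app.route("/api/chat", methods=["POST"])
def chat():
    # Forward the user's message to the local Ollama server and return the reply.
    data = request.get_json(force=True)
    response = ollama.chat(
        model=data.get("model", "llama3"),
        messages=[{"role": "user", "content": data.get("message", "")}],
    )
    return jsonify({"reply": response["message"]["content"]})


if __name__ == "__main__":
    # Flask's development server listens on http://localhost:5000 by default.
    app.run(debug=True)
```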
---
## 🛠️ Troubleshooting
- Ensure that you've followed all the installation steps for your operating system (Windows or Linux).
- Verify that Python is installed and that all required dependencies installed correctly; re-run `pip install -r requirements.txt` to catch anything that is missing.
- If you encounter any issues with Ollama, try restarting your system or checking the [Ollama documentation](https://ollama.com/docs) for more help. If the app runs but chat does not respond, see the quick model check below.
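A common cause of an unresponsive chat is that no model has been pulled yet. Here is a minimal check, assuming Ollama's default API address, that lists the models your local server currently has:

```python
# list_models.py: check which models your local Ollama server has pulled.
# Uses Ollama's default HTTP API (http://localhost:11434/api/tags); adjust if needed.
import json
import urllib.request

TAGS_URL = "http://localhost:11434/api/tags"

with urllib.request.urlopen(TAGS_URL, timeout=5) as resp:
    models = json.load(resp).get("models", [])

if not models:
    print("No models found. Pull one first, e.g.: ollama pull llama3")
else:
    print("Locally available models:")
    for entry in models:
        print(" -", entry.get("name", "?"))
```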
---

Enjoy using Ollama Web! If you run into problems, feel free to open an issue on the [GitHub repository](https://github.com/shekharP1536/ollama-web).
## 📬 Contact
If you have any questions or need assistance, feel free to reach out:
- 📧 **Email:** [[email protected]](mailto:[email protected])
- 🔗 **LinkedIn:** [Shekhar](https://www.linkedin.com/in/chandrashekhar-pachlore/)