https://github.com/asierso/ochat-app
Mobile chat client for Ollama AI
- Host: GitHub
- URL: https://github.com/asierso/ochat-app
- Owner: Asierso
- Created: 2024-09-23T05:25:40.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-11-12T20:28:40.000Z (11 months ago)
- Last Synced: 2025-04-03T20:43:12.313Z (7 months ago)
- Topics: android-app, chatbot, ollama-client
- Language: Kotlin
- Homepage:
- Size: 1.58 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# OCHAT - Ollama Chat App
Ochat is an Android chat client app for Ollama AI. Ochat supports chatting with different Ollama models.

>[!NOTE]
>Ochat doesn't provide an integrated Ollama API server. You need to deploy an Ollama server yourself to use Ochat.

## 📱 Requirements
To use this app you need:
- Android 8.0 or higher
- Ollama API service access
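
If you already have an Ollama server running, a quick way to check that its API is reachable is a plain HTTP call (a minimal sketch; replace the address with your server's, 11434 is Ollama's default port):

```bash
# List the models available on the Ollama server.
# Replace 127.0.0.1 with the address of your Ollama host if it runs elsewhere.
curl http://127.0.0.1:11434/api/tags
```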
## 🔨 Building
To build the project, clone it with `git clone https://github.com/Asierso/ochat-app` and open it in Android Studio.
- The project is prepared to compile with SDK 33 (Android 13)
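
You can also build a debug APK from the command line, assuming the standard Gradle wrapper layout that Android Studio projects use (the exact module and output path may differ):

```bash
# Clone the repository and build a debug APK with the Gradle wrapper.
git clone https://github.com/Asierso/ochat-app
cd ochat-app
./gradlew assembleDebug
# The APK is typically written to app/build/outputs/apk/debug/
```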
## 🐳 Deploy Ollama in Docker
The recommended way to deploy Ollama for use with Ochat is in a Docker container. You can check the steps [here](https://hub.docker.com/r/ollama/ollama).
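
A minimal sketch following the image's Docker Hub instructions (CPU-only; `llama3` is just an example model to pull):

```bash
# Start the Ollama server container and expose its API on port 11434.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
# Pull an example model inside the running container.
docker exec -it ollama ollama pull llama3
```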
## 💻 Deploy Ollama in Termux
If you don't want to deploy Ollama on a separate machine, you can run it directly on your phone using Termux and a Linux distro installed inside it.

>[!WARNING]
>Model performance may decrease when using Termux instead of containers.

- Download the Termux app on your phone [here](https://github.com/termux/termux-app)
- Open the app and run the following command (you will need to grant storage permissions to Termux): `pkg update ; pkg install wget -y ; wget https://raw.githubusercontent.com/wahasa/Debian/main/Install/debian12.sh ; chmod +x debian12.sh ; ./debian12.sh`
- Run `debian` in your Termux shell to enter the Debian environment
- Then, execute `apt update ; apt install curl ca-certificates -y && curl -fsSL https://ollama.com/install.sh | sh` and wait a few minutes

Now Ollama should be installed and functional. Start the service with `ollama serve` inside Debian.
>[!TIP]
>Optionally, you can run `echo "ollama serve &" >> .bashrc` so that Ollama starts automatically every time you enter Termux and run `debian`.

You can download models with `ollama pull <model>`. To access the server from the app, use these settings:
- IP `127.0.0.1`
- Port `11434`
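
With those settings, Ochat talks to the Ollama HTTP API at that address. To sanity-check that a pulled model answers before opening the app, you can issue a plain chat request yourself (a minimal sketch; `llama3` is only an example model name):

```bash
# Send a one-shot, non-streaming chat request to the Ollama API.
curl http://127.0.0.1:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "Hello!" }
  ],
  "stream": false
}'
```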