Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.

https://github.com/pamaldi/adso-chat

ADSO Chat is an interactive, Docker-based chat app with advanced language processing. It features a robust backend, Ollama for language tasks, and a user-friendly frontend, all containerized using Docker Compose. Key Features: Three services: backend, Ollama, frontend UI Advanced natural language processing Easy setup and deployment Customizable

chat chatbot llm ollama quarkus

Last synced: 8 days ago

# ADSO Chat Application

## Overview

ADSO Chat is a containerized application that leverages Docker Compose to manage three main services: a backend server, an Ollama language processing service, and a frontend user interface. This application is designed to provide an interactive chat experience with advanced language processing capabilities.

## Services

1. **ADSO Chat Backend**: Handles server-side logic and API requests.
2. **ADSO Chat Ollama**: Provides language model processing for enhanced chat functionality.
3. **ADSO UI**: Delivers the user interface for interacting with the chat application.
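
The three services above are orchestrated by `docker-compose.yml`. A minimal sketch of how such a file could lay them out, assuming service names, build contexts, and the image tag (the real file may differ):

```yaml
# Hypothetical sketch; service names, build contexts, and the Ollama
# image are assumptions based on the service list and ports in this README.
services:
  adso-chat-backend:
    build: ./backend          # assumed build context
    ports:
      - "8080:8080"           # backend (see Quick Start)
  adso-chat-ollama:
    image: ollama/ollama      # official Ollama image (assumed)
    ports:
      - "11434:11434"         # Ollama API
  adso-ui:
    build: ./ui               # assumed build context
    ports:
      - "80:80"               # UI
```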

## Prerequisites

- Docker
- Docker Compose

## Quick Start

1. Clone the repository:
```bash
git clone https://github.com/pamaldi/adso-chat.git
cd adso-chat
```

2. Build and start the services:
```bash
docker-compose up --build -d
```

3. Access the application:

* Backend: http://localhost:8080
* Ollama: http://localhost:11434
* UI: http://localhost

4. Usage

* View logs:
```bash
docker-compose logs -f
```
* Stop services:
```bash
docker-compose down
```
* Rebuild and restart services:
```bash
docker-compose up -d --build
```

5. Configuration

The application uses a custom bridge network named `mynetwork` for inter-service communication. Each service is configured with specific options for performance and reliability.

**Data persistence**: the Ollama service uses a bind mount to persist data; the local `./data` directory is mounted to `/data` in the Ollama container.
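
In compose syntax, the bind mount described above corresponds to an entry like the following (service name assumed):

```yaml
services:
  adso-chat-ollama:          # service name assumed
    volumes:
      - ./data:/data         # host ./data persisted into the container
```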

6. Keycloak

* Admin console: http://localhost:8081/admin/master/console/
* Login (OpenID Connect authorization endpoint): http://localhost:8081/realms/adso/protocol/openid-connect/auth?client_id=adso-user&redirect_uri=http://localhost:80&response_type=code&scope=openid
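
The login URL above follows Keycloak's standard authorization-endpoint pattern, so it can be assembled from its parts, which makes the individual parameters easier to adjust:

```bash
# Assemble the Keycloak authorization URL from its parameters.
# All values are taken verbatim from the URLs above.
BASE="http://localhost:8081"
REALM="adso"
CLIENT_ID="adso-user"
REDIRECT_URI="http://localhost:80"
AUTH_URL="${BASE}/realms/${REALM}/protocol/openid-connect/auth"
AUTH_URL="${AUTH_URL}?client_id=${CLIENT_ID}&redirect_uri=${REDIRECT_URI}&response_type=code&scope=openid"
echo "${AUTH_URL}"
```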

# Other
Build a single service:
```bash
docker-compose up --build -d adso-ui
```

## About Ollama

### What is Ollama?
Ollama is a service for running large language models locally and exposing them through an HTTP API. It can be integrated into applications to provide functionality such as text generation, translation, and more.

### Usage in ADSO Chat
In the context of the ADSO Chat application, Ollama is used to process and generate responses based on the inputs it receives. This enhances the chatbot's ability to understand and generate natural language.

### Configuration
Ensure that the Ollama endpoint is correctly configured in your application settings. This typically involves specifying the URL or IP address where the Ollama service is running.
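
One common way to make that endpoint configurable is an environment variable set on the backend service in `docker-compose.yml`. The variable name below is hypothetical, and the hostname assumes the Ollama service name on the shared network:

```yaml
services:
  adso-chat-backend:              # service name assumed
    environment:
      # Hypothetical variable name; on the shared bridge network,
      # services reach Ollama by service name rather than localhost.
      OLLAMA_BASE_URL: "http://adso-chat-ollama:11434"
```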

### Data Volume
The configuration includes a bind mount for the `./data` directory to `/data` within the Ollama container. This is used to persist data across container restarts and can be useful for storing model files or other relevant data.

### Network Configuration
Ollama is connected to the same custom bridge network (`mynetwork`) as the other services, allowing it to communicate seamlessly with the backend and UI components of the ADSO Chat application.
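
In compose syntax, such a network is declared once and attached to each service; a sketch (service name assumed):

```yaml
networks:
  mynetwork:
    driver: bridge

services:
  adso-chat-ollama:          # service name assumed; repeat for backend and UI
    networks:
      - mynetwork
```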