Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/rh-rad-ai-roadshow/parasol-insurance
- Host: GitHub
- URL: https://github.com/rh-rad-ai-roadshow/parasol-insurance
- Owner: rh-rad-ai-roadshow
- License: MIT
- Created: 2024-06-24T21:40:40.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2024-12-04T23:05:54.000Z (about 2 months ago)
- Last Synced: 2024-12-05T00:18:26.892Z (about 2 months ago)
- Topics: ai, java, langchain4j, quarkus
- Language: TypeScript
- Homepage:
- Size: 4.11 MB
- Stars: 1
- Watchers: 3
- Forks: 9
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Parasol Insurance
A [Quarkus](https://quarkus.io) + [React](https://react.dev/) AI app for managing fictitious insurance claims. Uses [Quarkus Quinoa](https://docs.quarkiverse.io/quarkus-quinoa/dev/index.html) under the covers.
![App](app/src/main/webui/src/app/assets/images/sample.png)
## Prerequisites
- Java 21 or later -- get it at https://adoptium.net/ or install it using your favorite package manager.
- Maven 3.9.6 or later -- get it at https://maven.apache.org/download.cgi or install it using your favorite package manager.
  - Or just use the embedded [Maven Wrapper](https://maven.apache.org/wrapper).
- An OpenAI-compatible LLM inference server. Get one with [InstructLab](https://github.com/instructlab/instructlab)!
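To verify the Java and Maven parts of the toolchain, a quick check (the Maven Wrapper lives in `app/`):

```
java -version               # should report 21 or later
cd app && ./mvnw -version   # or `mvn -version` if you installed Maven yourself
```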
## To Configure on an InstructLab Instance of the Red Hat Demo Platform
You can run [this script](https://raw.githubusercontent.com/jameslabocki/ilabdemo/main/install.sh) to install the Parasol app on the InstructLab instance of the Red Hat Demo Platform.
## Configuration
You can change the coordinates (host, port, and related settings) for the LLM and backend in [`app/src/main/resources/application.properties`](app/src/main/resources/application.properties).
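For example, assuming the app wires the LLM through the Quarkus LangChain4j OpenAI extension (check the actual file for the exact keys it uses), pointing it at a local InstructLab server might look like:

```
# Hypothetical excerpt -- the shipped application.properties is the source of truth.
# Base URL of the OpenAI-compatible inference server (InstructLab's default):
quarkus.langchain4j.openai.base-url=http://localhost:8000/v1
# Local servers typically ignore the key, but the client expects one to be set:
quarkus.langchain4j.openai.api-key=dummy
# Give slow local models plenty of time to respond:
quarkus.langchain4j.openai.timeout=120s
```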
## Running
First, get your inference server up and running. For example, with [InstructLab](https://github.com/instructlab/instructlab), running `ilab serve` starts a server listening on `localhost:8000` by default, which is also this app's default.
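Before starting the app, you can sanity-check that the server is reachable (this assumes it exposes the standard OpenAI-compatible `/v1` API, as InstructLab does):

```
# Should return a JSON object listing the served model(s):
curl http://localhost:8000/v1/models
```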
Then:
```
cd app
./mvnw clean quarkus:dev
```
The app will open on `http://0.0.0.0:8005`. Open the app, click on a claim, click on the chat app, and start asking questions. The context of the claim is sent to the LLM along with your query, and the response is shown in the chat (it may take time depending on your machine's performance).