https://github.com/oalles/ollama-java
A Java client for Ollama
- Host: GitHub
- URL: https://github.com/oalles/ollama-java
- Owner: oalles
- License: unlicense
- Created: 2023-11-07T17:14:57.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2025-03-24T08:16:25.000Z (7 months ago)
- Last Synced: 2025-03-31T23:58:00.864Z (6 months ago)
- Topics: client, java, language-model, llm, llm-inference, localdevelopment, ollama, spring-boot
- Language: Java
- Homepage:
- Size: 3.4 MB
- Stars: 22
- Watchers: 2
- Forks: 5
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# OLLAMA Java Client
## Modules
* [ollama-java-client](ollama-java-client) - The Ollama Java Client library
* [ollama-java-client-starter](ollama-java-client-starter) - The Ollama Java Client Spring Boot 3 Starter
* [spring-boot-ollama-sample](spring-boot-ollama-sample) - A sample Spring Boot 3 application using the Ollama Java Client Spring Boot 3 Starter

## Model Description
### OllamaService
The [OllamaService](src/main/java/es/omarall/ollama/OllamaService.java) interface defines the operations for
interacting with the Ollama web service.

```java
public interface OllamaService {
    CompletionResponse completion(CompletionRequest completionRequest);
    TagsResponse getTags();
    ShowResponse show(ShowRequest showRequest);
    void copy(CopyRequest copyRequest);
    void delete(String modelName);
    void streamingCompletion(CompletionRequest completionRequest, StreamResponseProcessor handler);
    EmbeddingResponse embed(EmbeddingRequest embeddingRequest);
}
```

### OllamaServiceFactory
The [OllamaServiceFactory](src/main/java/es/omarall/ollama/OllamaServiceFactory.java) class is responsible for creating
instances of the `OllamaService`. It provides static factory methods
to create an instance of the service with the specified configuration.

```java
public class OllamaServiceFactory {

    public static OllamaService create(OllamaProperties properties) {
        // ...
    }

    public static OllamaService create(OllamaProperties properties, Gson gson) {
        // ...
    }
}
```

### StreamResponseProcessor
The [StreamResponseProcessor](src/main/java/es/omarall/ollama/StreamResponseProcessor.java) interface provides methods
to process streaming completion responses.

```java
public interface StreamResponseProcessor<T> {

    void processStreamItem(T item);

    void processCompletion(T fullResponse);

    void processError(Throwable throwable);
}
```

### How to use
Create an instance of the `OllamaService` with the factory and use it.
See [HowToUse.java](./ollama-java-client/src/test/java/es/omarall/ollama/HowToUse.java)
or the [spring-boot-ollama-sample](spring-boot-ollama-sample) project.
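As a hedged illustration of the streaming contract described above, the sketch below implements a handler that accumulates streamed chunks. The interface is reproduced locally (and the demo feeds it `String` chunks) so the snippet compiles standalone; in a real project you would import `es.omarall.ollama.StreamResponseProcessor`, obtain the service from `OllamaServiceFactory`, and pass the handler to `streamingCompletion` instead of the simulated loop.

```java
// Local copy of the interface so this sketch is self-contained; a real
// project imports it from the ollama-java-client library instead.
interface StreamResponseProcessor<T> {
    void processStreamItem(T item);
    void processCompletion(T fullResponse);
    void processError(Throwable throwable);
}

public class StreamingHandlerDemo {

    public static void main(String[] args) {
        StringBuilder transcript = new StringBuilder();

        StreamResponseProcessor<String> handler = new StreamResponseProcessor<String>() {
            @Override
            public void processStreamItem(String item) {
                // Called once per streamed chunk; accumulate it.
                transcript.append(item);
            }

            @Override
            public void processCompletion(String fullResponse) {
                System.out.println("completed: " + fullResponse);
            }

            @Override
            public void processError(Throwable throwable) {
                System.err.println("stream failed: " + throwable.getMessage());
            }
        };

        // Simulate chunks arriving from a streaming completion.
        for (String chunk : new String[] {"Hello", ", ", "world"}) {
            handler.processStreamItem(chunk);
        }
        handler.processCompletion(transcript.toString());
    }
}
```

Running the demo prints `completed: Hello, world`; with a live service the chunks would instead arrive from the Ollama streaming endpoint.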
## Useful Resources
### API DOC
https://github.com/jmorganca/ollama/blob/main/docs/api.md
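As a quick smoke test of the endpoint this client wraps, the completion API can also be called directly with curl, assuming Ollama is running locally on its default port (11434) and the `mistral` model has been pulled:

```shell
# Non-streaming completion request against a local Ollama instance.
# Assumes the server is up on localhost:11434 and `mistral` is installed.
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```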
### Linux install
https://github.com/jmorganca/ollama/blob/main/docs/linux.md
```bash
$ curl https://ollama.ai/install.sh | sh
>>> Installing ollama to /usr/local/bin...
>>> Creating ollama user...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> NVIDIA GPU installed.
```

#### Test URL
```text
# open http://localhost:11434/
# or via curl
$ curl http://localhost:11434/api/tags
```

#### Install the Mistral 7B model
```bash
$ ollama run mistral
```

#### Viewing logs
To view logs of Ollama running as a startup service, run:
```bash
$ journalctl -u ollama
```

### Uninstall
Remove the ollama service:
```bash
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```

Remove the ollama binary from your bin directory (either /usr/local/bin, /usr/bin, or /bin):
```bash
sudo rm $(which ollama)
```

Remove the downloaded models and Ollama service user:
```bash
sudo rm -r /usr/share/ollama
sudo userdel ollama
```