# Ollama applet for COSMIC Desktop
Screenshots: chat and settings views.

Before using this applet, you must have Ollama installed on your system. To install it, run the following in your terminal:

```sh
curl -fsSL https://ollama.com/install.sh | sh
```

Source: [Ollama GitHub](https://github.com/ollama/ollama?tab=readme-ov-file#linux)
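Once the script finishes, you can verify the installation. On most Linux setups the installer puts `ollama` on your `PATH` and registers a systemd service (a sketch, assuming those defaults):

```sh
# Check that the ollama binary is available
ollama --version

# The Linux installer sets up a systemd service; confirm it is running
systemctl status ollama --no-pager
```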

After installing Ollama, pull the models you would like to chat with, for example:

```sh
ollama pull llama3
```

You can find more models in the Ollama library: https://ollama.com/library
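To see which models you have pulled, and to give one a quick test from the terminal before using the applet, you can run the following (`llama3` here is just an example model name):

```sh
# List locally available models
ollama list

# Send a one-off prompt to verify the model responds
ollama run llama3 "Say hello in one sentence."
```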

# Installing this applet

Clone the repository and use [just](https://github.com/casey/just) to build and install it.

If you don't have `just` installed, it is available in the Pop!_OS repositories, so you can install it with `apt`:

```sh
sudo apt install just
```

Now clone the repository and install the applet:

```sh
git clone https://github.com/elevenhsoft/cosmic-ext-applet-ollama.git
cd cosmic-ext-applet-ollama
```
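If you want to see every recipe the justfile provides before building (the exact recipe names depend on the repository's justfile, apart from the `install` and `build-no-wgpu` recipes mentioned in this README), you can run:

```sh
# Show all available just recipes in the cloned repo
just --list
```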

### Building

Run `just` to build the applet:

```sh
just
```

### Installing

```sh
sudo just install
```

Done!

You can now add the applet to your desktop panel or dock and chat with different models in real time. :)

Cheers!

## Known wgpu issue

There are currently rendering issues with the `wgpu` libcosmic feature on some (older?) GPUs.
This doesn't affect Ollama itself, only the applet.
If you are affected, you can build and install the applet with this feature disabled:

```sh
just build-no-wgpu
sudo just install
```