# whisper.cpp.docker

https://github.com/stellarbear/whisper.cpp.docker
Run [whisper.cpp](https://github.com/ggerganov/whisper.cpp) in a Docker container with GPU support.

## TLDR
```
docker compose up
```
or
```
MODEL=large-v2 LANGUAGE=ru docker compose up
```

## Step by step
### 1. Build the CUDA image (one-time step)
```
docker compose build --progress=plain
```

### 2. Download models (one-time step)
You may want to run the download manually in order to see its progress:
```
./models/download.sh large-v2
```
This script is a plain copy of [download-ggml-model.sh](https://github.com/ggerganov/whisper.cpp/blob/master/models/download-ggml-model.sh).
You can find additional information and configuration options [here](https://github.com/ggerganov/whisper.cpp/tree/master/models).

### 3. Prepare your files
Place all the files in the `./volume/input/` directory.
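For example (the source path `~/recordings` below is hypothetical; use wherever your audio files live):

```
# Stage audio files for transcription; ~/recordings is a hypothetical source path
mkdir -p ./volume/input
cp ~/recordings/*.wav ./volume/input/ 2>/dev/null || true  # || true: ignore if nothing matches
ls ./volume/input/
```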

### 4. Run docker compose
```
docker compose up
```
Or override the defaults:
```
MODEL=large-v2 LANGUAGE=ru docker compose up
MODEL=large-v3 LANGUAGE=ru docker compose up
MODEL=large-v3-turbo LANGUAGE=ru docker compose up
```
| Variable | Values | Default |
| -------- | ------- | ------- |
| MODEL | base, medium, large, [other options](https://github.com/ggerganov/whisper.cpp/blob/master/models/download-ggml-model.sh#L25) | large-v2 |
| LANGUAGE | en, ru, fr, etc. (depends on the model) | ru |
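Instead of prefixing every command, the two variables can also live in a `.env` file next to `docker-compose.yml`, which `docker compose` reads automatically (the values below are just examples):

```
# .env — read automatically by docker compose from the project directory
MODEL=large-v3
LANGUAGE=en
```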

### 5. Result
You can find the results in the `./volume/output/` directory.
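A quick way to check what was produced (assuming the bind mount has created the directory; `mkdir -p` is a no-op once it exists):

```
# List the transcripts written by the container; file names mirror the inputs
mkdir -p ./volume/output
ls ./volume/output/
```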