{"id":16488493,"url":"https://github.com/stellarbear/whisper.cpp.docker","last_synced_at":"2026-04-09T16:44:03.455Z","repository":{"id":215237619,"uuid":"738438525","full_name":"stellarbear/whisper.cpp.docker","owner":"stellarbear","description":"run whisper.cpp in docker","archived":false,"fork":false,"pushed_at":"2024-10-14T09:12:46.000Z","size":12,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-01-11T17:47:13.479Z","etag":null,"topics":["docker","docker-compose","gpu","whisper-cpp"],"latest_commit_sha":null,"homepage":"","language":"Shell","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/stellarbear.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-01-03T08:23:45.000Z","updated_at":"2024-10-14T09:12:50.000Z","dependencies_parsed_at":"2025-01-11T17:44:23.368Z","dependency_job_id":"f02a0c83-9e11-4ede-abcc-3fa2147d2662","html_url":"https://github.com/stellarbear/whisper.cpp.docker","commit_stats":null,"previous_names":["stellarbear/whisper.cpp.docker"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/stellarbear%2Fwhisper.cpp.docker","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/stellarbear%2Fwhisper.cpp.docker/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/stellarbear%2Fwhisper.cpp.docker/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/stellarbear%2Fwhisper.cpp.do
cker/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/stellarbear","download_url":"https://codeload.github.com/stellarbear/whisper.cpp.docker/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":241306021,"owners_count":19941243,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["docker","docker-compose","gpu","whisper-cpp"],"created_at":"2024-10-11T13:39:06.286Z","updated_at":"2026-04-09T16:44:03.390Z","avatar_url":"https://github.com/stellarbear.png","language":"Shell","readme":"Run [whisper.cpp](https://github.com/ggerganov/whisper.cpp) in a Docker container with GPU support.\n\n## TLDR\n```\ndocker compose up\n```\nor\n```\nMODEL=large-v2 LANGUAGE=ru docker compose up\n```\n\n## Step by step\n### 1. Build the CUDA image (single run)\n```\ndocker compose build --progress=plain\n```\n\n### 2. Download models (single run)\nYou may want to run this step manually in order to watch the download progress:\n```\n./models/download.sh large-v2\n```\nThis script is a plain copy of [download-ggml-model.sh](https://github.com/ggerganov/whisper.cpp/blob/master/models/download-ggml-model.sh).\nYou may find additional information and configuration options [here](https://github.com/ggerganov/whisper.cpp/tree/master/models).\n\n### 3. Prepare your files\nPlace all the files to be transcribed in the ```./volume/input/``` directory.\n\n### 4. 
Run docker compose\n```\ndocker compose up\n```\nConfigure the defaults via environment variables:\n```\nMODEL=large-v2 LANGUAGE=ru docker compose up\nMODEL=large-v3 LANGUAGE=ru docker compose up\nMODEL=large-v3-turbo LANGUAGE=ru docker compose up\n```\n| Argument | Values | Default |\n| -------- | ------- | ------- |\n| model | base, medium, large, [other options](https://github.com/ggerganov/whisper.cpp/blob/master/models/download-ggml-model.sh#L25) | large-v2 |\n| language | en, ru, fr, etc. (depends on the model) | ru |\n\n### 5. Result\nYou can find the results in the ```./volume/output/``` directory.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fstellarbear%2Fwhisper.cpp.docker","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fstellarbear%2Fwhisper.cpp.docker","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fstellarbear%2Fwhisper.cpp.docker/lists"}