https://github.com/dstackai/dstack
dstack is an open-source alternative to Kubernetes and Slurm, designed to simplify GPU allocation and AI workload orchestration for ML teams across top clouds, on-prem clusters, and accelerators.
- Host: GitHub
- URL: https://github.com/dstackai/dstack
- Owner: dstackai
- License: mpl-2.0
- Created: 2022-01-04T10:29:46.000Z (almost 4 years ago)
- Default Branch: master
- Last Pushed: 2025-04-26T20:46:07.000Z (7 months ago)
- Last Synced: 2025-04-27T04:42:50.956Z (7 months ago)
- Topics: amd, aws, azure, cloud, docker, fine-tuning, gcp, gpu, inference, k8s, kubernetes, llms, machine-learning, nvidia, orchestration, python, slurm, training
- Language: Python
- Homepage: https://dstack.ai/
- Size: 114 MB
- Stars: 1,764
- Watchers: 13
- Forks: 168
- Open Issues: 113
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.md
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-llmops - Dstack - effective LLM development in any cloud (AWS, GCP, Azure, Lambda, etc.) (LLMOps / Observability)
- awesome-llm-eval - Dstack
- awesome-production-machine-learning - dstack - dstack is an open-source container orchestrator that simplifies workload orchestration and drives GPU utilization for ML teams. (Model Training and Orchestration)
README
[Last commit](https://github.com/dstackai/dstack/commits/)
[License](https://github.com/dstackai/dstack/blob/master/LICENSE.md)
[Discord](https://discord.gg/u8SmfwPpMd)
`dstack` is a unified control plane for GPU provisioning and orchestration that works with any GPU cloud, Kubernetes, or on-prem clusters.
It streamlines development, training, and inference, and is compatible with any hardware, open-source tools, and frameworks.
#### Hardware
`dstack` supports `NVIDIA`, `AMD`, `Google TPU`, `Intel Gaudi`, and `Tenstorrent` accelerators out of the box.
## Latest news ✨
- [2025/10] [dstack 0.19.31: Kubernetes, GCP A4 spot](https://github.com/dstackai/dstack/releases/tag/0.19.31)
- [2025/08] [dstack 0.19.26: Repos](https://github.com/dstackai/dstack/releases/tag/0.19.26)
- [2025/08] [dstack 0.19.22: Service probes, GPU health-checks, Tenstorrent Galaxy](https://github.com/dstackai/dstack/releases/tag/0.19.22)
- [2025/07] [dstack 0.19.21: Scheduled tasks](https://github.com/dstackai/dstack/releases/tag/0.19.21)
- [2025/07] [dstack 0.19.17: Secrets, Files, Rolling deployment](https://github.com/dstackai/dstack/releases/tag/0.19.17)
- [2025/06] [dstack 0.19.16: Docker in Docker](https://github.com/dstackai/dstack/releases/tag/0.19.16)
- [2025/06] [dstack 0.19.13: Default images with InfiniBand support](https://github.com/dstackai/dstack/releases/tag/0.19.13)
## How does it work?

### Installation
> Before using `dstack` through CLI or API, set up a `dstack` server. If you already have a running `dstack` server, you only need to [set up the CLI](#set-up-the-cli).
#### Set up the server
##### Configure backends
To orchestrate compute across cloud providers or existing Kubernetes clusters, you need to configure backends.
Backends can be set up in `~/.dstack/server/config.yml` or through the [project settings page](https://dstack.ai/docs/concepts/projects#backends) in the UI.
For more details, see [Backends](https://dstack.ai/docs/concepts/backends).
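As an illustration, a minimal `~/.dstack/server/config.yml` with a single AWS backend using default credentials might look like the sketch below (the exact fields vary by provider; see the [Backends](https://dstack.ai/docs/concepts/backends) docs for the full schema):
```yaml
# ~/.dstack/server/config.yml — minimal sketch, assuming an AWS backend
# whose credentials are picked up from the default AWS credential chain
projects:
  - name: main
    backends:
      - type: aws
        creds:
          type: default
```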
> When using `dstack` with on-prem servers, backend configuration isn’t required. Simply create [SSH fleets](https://dstack.ai/docs/concepts/fleets#ssh) once the server is up.
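For on-prem machines, an SSH fleet is itself just a YAML configuration. A rough sketch, with placeholder host addresses and key path, could look like:
```yaml
# fleet.dstack.yml — illustrative SSH fleet; hosts, user, and key path are placeholders
type: fleet
name: on-prem-fleet
ssh_config:
  user: ubuntu
  identity_file: ~/.ssh/id_rsa
  hosts:
    - 192.168.1.10
    - 192.168.1.11
```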
##### Start the server
You can install the server on Linux, macOS, and Windows (via WSL 2). It requires Git and OpenSSH.
##### uv
```shell
$ uv tool install "dstack[all]" -U
```
##### pip
```shell
$ pip install "dstack[all]" -U
```
Once it's installed, go ahead and start the server.
```shell
$ dstack server
Applying ~/.dstack/server/config.yml...
The admin token is "bbae0f28-d3dd-4820-bf61-8f4bb40815da"
The server is running at http://127.0.0.1:3000/
```
> For more details on server configuration options, see the [Server deployment](https://dstack.ai/docs/guides/server-deployment) guide.
#### Set up the CLI
Once the server is up, you can access it via the `dstack` CLI.
The CLI can be installed on Linux, macOS, and Windows. It requires Git and OpenSSH.
##### uv
```shell
$ uv tool install dstack -U
```
##### pip
```shell
$ pip install dstack -U
```
To point the CLI to the `dstack` server, configure it
with the server address, user token, and project name:
```shell
$ dstack project add \
--name main \
--url http://127.0.0.1:3000 \
--token bbae0f28-d3dd-4820-bf61-8f4bb40815da
Configuration is updated at ~/.dstack/config.yml
```
### Define configurations
`dstack` supports the following configurations:
* [Dev environments](https://dstack.ai/docs/dev-environments) — for interactive development using a desktop IDE
* [Tasks](https://dstack.ai/docs/tasks) — for scheduling jobs (incl. distributed jobs) or running web apps
* [Services](https://dstack.ai/docs/services) — for deployment of models and web apps (with auto-scaling and authorization)
* [Fleets](https://dstack.ai/docs/fleets) — for managing cloud and on-prem clusters
* [Volumes](https://dstack.ai/docs/concepts/volumes) — for managing persistent volumes
* [Gateways](https://dstack.ai/docs/concepts/gateways) — for configuring ingress traffic and public endpoints
Configurations are defined as YAML files within your repo.
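For instance, a minimal task configuration might look like the sketch below (the file name, run name, and resource values are illustrative; see the [Tasks](https://dstack.ai/docs/tasks) docs for the full schema):
```yaml
# .dstack.yml — illustrative task configuration
type: task
name: train            # hypothetical run name
python: "3.11"
commands:
  - pip install -r requirements.txt
  - python train.py
resources:
  gpu: 24GB            # request a GPU with at least 24GB of memory
```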
### Apply configurations
Apply the configuration either via the `dstack apply` CLI command or through a programmatic API.
`dstack` automatically manages provisioning, job queuing, auto-scaling, networking, volumes, run failures,
out-of-capacity errors, port-forwarding, and more — across clouds and on-prem clusters.
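For example, applying a configuration file from your repo could look like this (the file name is illustrative):
```shell
$ dstack apply -f .dstack.yml
```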
## Useful links
For additional information, see the following links:
* [Docs](https://dstack.ai/docs)
* [Examples](https://dstack.ai/examples)
* [Discord](https://discord.gg/u8SmfwPpMd)
## Contributing
You're very welcome to contribute to `dstack`.
Learn more about how to contribute to the project at [CONTRIBUTING.md](CONTRIBUTING.md).
## License
[Mozilla Public License 2.0](LICENSE.md)