Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dstackai/dstack
dstack is an open-source engine that automates infrastructure provisioning on any cloud — for development, training, and deployment of AI models. Discord: https://discord.gg/u8SmfwPpMd
aws azure cloud gcp gpu llms machine-learning orchestration python
Last synced: 3 months ago
JSON representation
- Host: GitHub
- URL: https://github.com/dstackai/dstack
- Owner: dstackai
- License: mpl-2.0
- Created: 2022-01-04T10:29:46.000Z (over 2 years ago)
- Default Branch: master
- Last Pushed: 2024-03-08T20:38:30.000Z (3 months ago)
- Last Synced: 2024-03-09T12:54:58.450Z (3 months ago)
- Topics: aws, azure, cloud, gcp, gpu, llms, machine-learning, orchestration, python
- Language: Python
- Homepage: https://dstack.ai
- Size: 94.2 MB
- Stars: 1,024
- Watchers: 13
- Forks: 70
- Open Issues: 49
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.md
- Code of conduct: CODE_OF_CONDUCT.md
Lists
- awesome-mlops - dstack - An open-core tool to automate data and training workflows. (Workflow Tools)
- awesome-llm-eval - Dstack
- awesome-stars - dstackai/dstack - An open-source container orchestration engine for running AI workloads in any cloud or data center. https://discord.gg/u8SmfwPpMd (Python)
- awesome-llmops - Dstack - effective LLM development in any cloud (AWS, GCP, Azure, Lambda, etc.) (LLMOps / Observability)
README
Orchestrate GPU workloads effortlessly on any cloud

[![Last commit](https://img.shields.io/github/last-commit/dstackai/dstack?style=flat-square)](https://github.com/dstackai/dstack/commits/)
[![PyPI - License](https://img.shields.io/pypi/l/dstack?style=flat-square&color=blue)](https://github.com/dstackai/dstack/blob/master/LICENSE.md)

`dstack` is an open-source engine for running GPU workloads on any cloud.
It works with a wide range of cloud GPU providers (AWS, GCP, Azure, Lambda, TensorDock, Vast.ai, etc.)
as well as on-premises servers.

## Latest news ✨
- [2024/03] [dstack 0.16.1: Improvements to `dstack pool` and bug-fixes](https://dstack.ai/changelog/0.16.1/) (Release)
- [2024/02] [dstack 0.16.0: Pools](https://dstack.ai/changelog/0.16.0/) (Release)
- [2024/02] [dstack 0.15.1: Kubernetes integration](https://dstack.ai/changelog/0.15.1/) (Release)
- [2024/01] [dstack 0.15.0: Resources, authentication, and more](https://dstack.ai/changelog/0.15.0/) (Release)
- [2024/01] [dstack 0.14.0: OpenAI-compatible endpoints](https://dstack.ai/changelog/0.14.0/) (Release)
- [2023/12] [dstack 0.13.0: Disk size, CUDA 12.1, Mixtral, and more](https://dstack.ai/changelog/0.13.0/) (Release)

## Installation
Before using `dstack` via the CLI or API, set up a `dstack` server.
### Install the server
The easiest way to install the server is via `pip`:

```shell
pip install "dstack[all]" -U
```

### Configure backends
If you have default AWS, GCP, or Azure credentials on your machine, the `dstack` server will pick them up automatically.
Otherwise, you need to manually specify the cloud credentials in `~/.dstack/server/config.yml`.
For further details on setting up the server, refer to [installation](https://dstack.ai/docs/installation/).
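As an illustrative sketch (the exact schema is covered in the installation docs; backend types and credential options vary), a minimal `~/.dstack/server/config.yml` for an AWS backend might look like this:

```yaml
# ~/.dstack/server/config.yml -- illustrative sketch, not a complete reference
projects:
- name: main
  backends:
  - type: aws
    creds:
      type: default  # pick up credentials from the default AWS credential chain
```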
### Start the server
To start the server, use the `dstack server` command:
```shell
$ dstack server

Applying ~/.dstack/server/config.yml...

The admin token is "bbae0f28-d3dd-4820-bf61-8f4bb40815da"
The server is running at http://127.0.0.1:3000/
```

> **Note**
> It's also possible to run the server via [Docker](https://hub.docker.com/r/dstackai/dstack).
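A typical Docker invocation looks roughly like the following (the flags and mount path are a sketch; see the Docker Hub page for the exact options):

```shell
# Run the dstack server in Docker, persisting its state on the host
docker run -p 3000:3000 \
  -v $HOME/.dstack/server/:/root/.dstack/server \
  dstackai/dstack
```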
### CLI & API

Once the server is up, you can use either `dstack`'s CLI or API to run workloads.
Below is a live demo of how it works with the CLI.

### Dev environments

You specify the required environment and resources, then run it. `dstack` provisions the dev
environment in the cloud and enables access via your desktop IDE.
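As a sketch, a dev environment configuration (e.g. a `.dstack.yml` file) might look like this; the property names follow the documented configuration schema, but check the docs for your version:

```yaml
# .dstack.yml -- illustrative dev environment sketch
type: dev-environment
python: "3.11"
ide: vscode
resources:
  gpu: 24GB  # request a GPU with at least 24GB of memory
```

You would then start it with `dstack run .` from the configuration's directory.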
### Tasks

Tasks allow for convenient scheduling of any kind of batch job, such as training, fine-tuning,
or data processing, as well as running web applications.

Specify the environment and resources, then run it. `dstack` executes the task in the
cloud, enabling port forwarding to your local machine for convenient access.
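As an illustrative sketch (the `train.py` script and `requirements.txt` are hypothetical placeholders), a task configuration might look like this:

```yaml
# .dstack.yml -- illustrative task sketch
type: task
python: "3.11"
commands:
  - pip install -r requirements.txt
  - python train.py
ports:
  - 6006  # e.g. forward TensorBoard to localhost
resources:
  gpu: 24GB
```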
### Services

Services make it easy to deploy any kind of model or web application as a public endpoint.

Use any serving framework and specify the required resources. `dstack` deploys it in the configured
backend, handles authentication, and provides an OpenAI-compatible interface if needed.
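As a sketch in the style of the TGI example linked below (the image tag and model ID are illustrative, not prescribed by `dstack`), a service configuration might look like this:

```yaml
# .dstack.yml -- illustrative service sketch
type: service
image: ghcr.io/huggingface/text-generation-inference:latest
env:
  - MODEL_ID=mistralai/Mistral-7B-Instruct-v0.2
port: 80
commands:
  - text-generation-launcher --port 80
resources:
  gpu: 24GB
```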
### Pools

Pools simplify managing the lifecycle of cloud instances and enable their efficient reuse across runs.

You can have instances provisioned in the cloud automatically, or add them manually, configuring the required resources,
idle duration, etc.
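For example, you can inspect the instances available for reuse (subcommands may differ across versions; see `dstack pool --help`):

```shell
dstack pool ps  # list instances in the current pool and their status
```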
## Examples

Here are some featured examples:
- [TGI](https://dstack.ai/examples/tgi/)
- [vLLM](https://dstack.ai/examples/vllm/)
- [Ollama](https://dstack.ai/examples/ollama/)
- [SDXL](https://dstack.ai/examples/sdxl/)
- [QLoRA](https://dstack.ai/examples/qlora/)

Browse [dstack.ai/examples](https://dstack.ai/examples) for more examples.
## More information
For additional information and examples, see the following links:
- [Docs](https://dstack.ai/docs)
- [Discord](https://discord.gg/u8SmfwPpMd)

## License
[Mozilla Public License 2.0](LICENSE.md)