{"id":13573004,"url":"https://github.com/alibaba/easydist","last_synced_at":"2025-07-24T00:37:17.212Z","repository":{"id":193224867,"uuid":"688342370","full_name":"alibaba/easydist","owner":"alibaba","description":"Automated Parallelization System and Infrastructure for Multiple Ecosystems","archived":false,"fork":false,"pushed_at":"2024-11-19T12:24:43.000Z","size":535,"stargazers_count":79,"open_issues_count":0,"forks_count":6,"subscribers_count":6,"default_branch":"main","last_synced_at":"2025-07-04T21:06:46.072Z","etag":null,"topics":["automatic-parallelization","deep-learning","diffusion-models","distributed-computing","graph-neural-networks","high-performance-computing","large-language-models","llama","llm","machine-learning"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/alibaba.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-09-07T06:44:32.000Z","updated_at":"2025-05-23T07:04:15.000Z","dependencies_parsed_at":"2023-12-19T08:35:21.550Z","dependency_job_id":"f5142dfe-e2c9-42a6-901e-20390f1d1865","html_url":"https://github.com/alibaba/easydist","commit_stats":{"total_commits":127,"total_committers":11,"mean_commits":"11.545454545454545","dds":0.5669291338582677,"last_synced_commit":"4ced4cd7ca7a39722670beae16332426d5430238"},"previous_names":["alibaba/easydist"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/alibaba/easydist","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alibaba%2Feasydist","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alibaba%2Feasydist/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alibaba%2Feasydist/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alibaba%2Feasydist/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/alibaba","download_url":"https://codeload.github.com/alibaba/easydist/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alibaba%2Feasydist/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":264502991,"owners_count":23618673,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["automatic-parallelization","deep-learning","diffusion-models","distributed-computing","graph-neural-networks","high-performance-computing","large-language-models","llama","llm","machine-learning"],"created_at":"2024-08-01T15:00:26.353Z","updated_at":"2025-07-24T00:37:17.179Z","avatar_url":"https://github.com/alibaba.png","language":"Python","readme":"# EasyDist\n\nEasyDist is an automated parallelization system and infrastructure designed for multiple ecosystems, offering the following key features:\n\n- **Usability**. With EasyDist, parallelizing your training or inference code to a larger scale becomes effortless with just a single line of change.\n\n- **Ecological Compatibility**. EasyDist serves as a centralized source of truth for SPMD rules at the operator-level for various machine learning frameworks. Currently, EasyDist currently supports PyTorch, Jax natively, and the TVM Tensor Expression operator for SPMD rules.\n\n- **Infrastructure**. EasyDist decouples auto-parallel algorithms from specific machine learning frameworks and IRs. This design choice allows for the development and benchmarking of different auto-parallel algorithms in a more flexible manner, leveraging the capabilities and abstractions provided by EasyDist.\n\n## One Line of Code for Parallelism\n\nTo parallelize your training loop using EasyDist, you can use the `easydist_compile` decorator. Here's an example of how it can be used with PyTorch:\n\n```python\n@easydist_compile()\ndef train_step(net, optimizer, inputs, labels):\n\n    outputs = net(inputs)\n    loss = nn.CrossEntropyLoss()(outputs, labels)\n    loss.backward()\n\n    optimizer.step()\n    optimizer.zero_grad()\n\n    return loss\n```\n\nThis one-line decorator parallelizes the training step. You can find more examples in the [`./examples/`](./examples/) directory.\n\n## Overview\n\nEasyDist introduces the concept of MetaOp and MetaIR to decouple automatic parallelization methods from specific intermediate representations (IR) and frameworks. Additionally, it presents the ShardCombine Algorithm, which defines operator Single-Program, Multiple-Data (SPMD) sharding rules without requiring manual annotations. The architecture of EasyDist is as follows:\n\n\u003cbr\u003e\u003cdiv id=\"top\" align=\"center\"\u003e\n\u003cimg src=\"./assets/arch.svg\" width=\"650\"\u003e\n\u003c/div\u003e\u003cbr\u003e\n\n## Installation\n\nTo install EasyDist, you can use pip and install from PyPI:\n\n```shell\n# For PyTorch users\npip install pai-easydist[torch]\n\n# For Jax users\npip install pai-easydist[jax] -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html\n```\n\nIf you prefer to install EasyDist from source, you can clone the GitHub repository and then install it with the appropriate extras:\n\n```shell\ngit clone https://github.com/alibaba/easydist.git \u0026\u0026 cd easydist\n\n# EasyDist with PyTorch installation\npip install -e '.[torch]'\n\n# EasyDist with Jax installation\npip install -e '.[jax]' -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html\n```\n\n## Contributing\n\nSee CONTRIBUTING.md for details.\n\n## Contributors\n\nEasyDist is developed by Alibaba Group and NUS HPC-AI Lab. This work is supported by [Alibaba Innovative Research(AIR)](https://damo.alibaba.com/air/).\n\n## License\n\nEasyDist is licensed under the Apache License (Version 2.0). See LICENSE file.\nThis product contains some third-party testcases under other open source licenses. \nSee the NOTICE file for more information.\n\n","funding_links":[],"categories":["Open Source Projects"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falibaba%2Feasydist","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Falibaba%2Feasydist","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falibaba%2Feasydist/lists"}