{"id":19066161,"url":"https://github.com/epfml/powergossip","last_synced_at":"2025-08-12T09:04:16.512Z","repository":{"id":109046878,"uuid":"284922280","full_name":"epfml/powergossip","owner":"epfml","description":"Code for \"Practical Low-Rank Communication Compression in Decentralized Deep Learning\"","archived":false,"fork":false,"pushed_at":"2020-08-04T09:16:42.000Z","size":52,"stargazers_count":16,"open_issues_count":0,"forks_count":1,"subscribers_count":6,"default_branch":"master","last_synced_at":"2025-04-28T12:39:19.511Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://arxiv.org/abs/2008.01425","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/epfml.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2020-08-04T08:30:03.000Z","updated_at":"2025-01-18T09:14:03.000Z","dependencies_parsed_at":"2023-04-05T23:32:56.267Z","dependency_job_id":null,"html_url":"https://github.com/epfml/powergossip","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/epfml/powergossip","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fpowergossip","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fpowergossip/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fpowergossip/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fpowergossip/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/epfml","download_url":"https://codeload.github.com/epfml/powergossip/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fpowergossip/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":270032623,"owners_count":24515322,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-08-12T02:00:09.011Z","response_time":80,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-09T00:54:59.111Z","updated_at":"2025-08-12T09:04:16.455Z","avatar_url":"https://github.com/epfml.png","language":"Python","readme":"# PowerGossip: Practical Low-Rank Communication Compression in Decentralized Deep Learning\n\nAbstract:\n\nLossy gradient compression has become a practical tool to overcome the communication bottleneck in centrally coordinated distributed training of machine learning models.\nHowever, algorithms for decentralized training with compressed communication over arbitrary connected networks have been more complicated, requiring additional memory and hyperparameters.\nWe introduce a simple algorithm that directly compresses the model differences between neighboring workers using low-rank linear compressors applied on model differences.\nInspired by the PowerSGD algorithm for centralized deep learning, this algorithm uses power iteration steps to maximize the information transferred per bit.\nWe prove that our method requires no additional hyperparameters, converges faster than prior methods, and is asymptotically independent of both the network and the compression.\nOut of the box, these compressors perform on par with state-of-the-art tuned compression algorithms in a series of deep learning benchmarks.\n\n# Code\n\n-   [train.py](train.py) is the entrypoint for deep learning experiments.\n-   [gossip_run.py](gossip_run.py) is the entrypoint for consensus experiments.\n-   [algorithms.py](algorithms.py) implements decentralized consensus and learning algorithms.\n-   [compressors.py](compressors.py) implements compressors for [ChocoSGD](https://github.com/epfml/ChocoSGD) and `DeepSqueeze`.\n-   Experiment scheduling code for our experiments is listed under [experiments](experiments/).\n\n# Running and configuring experiments\n\nYou can override the global variables `config`, `log_metric` and `output_dir` from [train.py](train.py) before running `train.main()`:\n\n```python\nimport train\n\n# Configure the worker\ntrain.config[\"n_workers\"] = 4\ntrain.config[\"rank\"] = 0\n\ntrain.output_dir = \"whatever you like\"\ntrain.log_info = your_function_pointer\ntrain.log_metric = your_metric_function_pointer\n\ntrain.main()\n```\n\n# Distributed training\n\nWe use a separate process per “worker”.\nWe use MPI (`mpirun`) to create these processes.\n\n# Environment\n\nWe provide a `setup.sh` file under [environment](environment)\nthat describes our computation environment.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fepfml%2Fpowergossip","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fepfml%2Fpowergossip","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fepfml%2Fpowergossip/lists"}