{"id":19079689,"url":"https://github.com/devzhk/cgds-package","last_synced_at":"2025-04-30T05:44:04.913Z","repository":{"id":57417083,"uuid":"275386738","full_name":"devzhk/cgds-package","owner":"devzhk","description":"Package for CGD and ACGD optimizers","archived":false,"fork":false,"pushed_at":"2022-08-11T21:49:19.000Z","size":52,"stargazers_count":20,"open_issues_count":2,"forks_count":4,"subscribers_count":4,"default_branch":"master","last_synced_at":"2025-04-30T05:43:56.544Z","etag":null,"topics":["optimizers","pytorch"],"latest_commit_sha":null,"homepage":"https://pypi.org/project/CGDs","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/devzhk.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2020-06-27T14:15:47.000Z","updated_at":"2024-11-22T10:41:48.000Z","dependencies_parsed_at":"2022-09-13T12:42:57.033Z","dependency_job_id":null,"html_url":"https://github.com/devzhk/cgds-package","commit_stats":null,"previous_names":[],"tags_count":6,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/devzhk%2Fcgds-package","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/devzhk%2Fcgds-package/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/devzhk%2Fcgds-package/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/devzhk%2Fcgds-package/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/devzhk","download_url":"https://codeload.github.com/devzhk/cgds-package/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","ki
nd":"github","repositories_count":251651221,"owners_count":21621702,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["optimizers","pytorch"],"created_at":"2024-11-09T02:15:37.783Z","updated_at":"2025-04-30T05:44:04.892Z","avatar_url":"https://github.com/devzhk.png","language":"Python","readme":"# CGDs\n## Overview\n`CGDs` is a package implementing three variants of the [CGD](https://arxiv.org/abs/1905.12103) algorithm in [Pytorch](https://pytorch.org/), using Hessian-vector products and conjugate gradient.\n`CGDs` targets competitive optimization problems, such as generative adversarial networks (GANs), of the form:\n$$\n\\min_{\\mathbf{x}}f(\\mathbf{x}, \\mathbf{y}) \\qquad \\min_{\\mathbf{y}} g(\\mathbf{x}, \\mathbf{y})\n$$\n\n## Installation\n```bash\npip3 install CGDs\n```\nYou can also directly download the `CGDs` directory and copy it into your project.\n\n## Package description\n\nThe `CGDs` package implements the following optimization algorithms with Pytorch:\n\n- `BCGD`: the CGD algorithm from [Competitive Gradient Descent](https://arxiv.org/abs/1905.12103).\n- `ACGD`: the ACGD algorithm from [Implicit competitive regularization in GANs](https://arxiv.org/abs/1910.05852).\n- `GACGD`: works for general-sum problems, where each player minimizes its own loss.\n\n## How to use\nQuickstart with notebook: [Examples of using ACGD](https://colab.research.google.com/drive/1-52aReaBAPNBtq2NcHxKkVIbdVXdyqtH?usp=sharing).\n\nAs with the Pytorch package `torch.optim`, using the optimizers in `CGDs` involves two main steps: construction and the update step. 
\n### Construction\nTo construct an optimizer, give it two iterables containing the parameters to maximize over and to minimize over (all should be tensors with `requires_grad=True`).\nYou also need to specify the `device` and the learning rates `lr_max` and `lr_min`.\n\nExample:\n```python\nimport CGDs\nimport torch\ndevice = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')\noptimizer = CGDs.ACGD(max_params=model_G.parameters(), min_params=model_D.parameters(), \n                      lr_max=1e-3, lr_min=1e-3, device=device)\noptimizer = CGDs.BCGD(max_params=[var1, var2], min_params=[var3, var4, var5], \n                      lr_max=0.01, lr_min=0.01, device=device)\n```\n\n### Update step\n\nBoth optimizers have a `step()` method, which updates the parameters according to their update rules. It can be called once the computation graph has been created. You pass the loss in directly and do not compute gradients before `step()`, which is *different* from `torch.optim`.\n\nExample:\n\n```python\nfor data in dataset:\n    optimizer.zero_grad()\n    real_pred = model_D(data)\n    latent = torch.randn((batch_size, latent_dim), device=device)\n    fake_pred = model_D(model_G(latent))\n    loss = loss_fn(real_pred, fake_pred)\n    optimizer.step(loss=loss)\n```\nFor general competitive optimization, two losses should be defined and passed to `optimizer.step`:\n```python\nloss_x = loss_f(x, y)\nloss_y = loss_g(x, y)\noptimizer.step(loss_x, loss_y)\n```\n\n## Use with Pytorch DistributedDataParallel\n\nFor example:\n```python\nG = DDP(G, device_ids=[rank], broadcast_buffers=False)\nD = DDP(D, device_ids=[rank], broadcast_buffers=False)\ng_reducer = G.reducer\nd_reducer = D.reducer\n\noptimizer = ACGD(max_params=G.parameters(), min_params=D.parameters(), \n                 max_reducer=g_reducer, min_reducer=d_reducer, \n                 lr_max=1e-3, lr_min=1e-3, \n                 tol=1e-4, atol=1e-8)\nfor data in dataloader:\n    real_pred = D(data)\n    latent = torch.randn((batch_size, latent_dim))\n    fake_img = 
G(latent)\n    fake_pred = D(fake_img)\n    # trigger is a scalar that touches both G's and D's outputs;\n    # it is used to trigger DDP gradient communication\n    trigger = real_pred[0, 0] + fake_img[0, 0, 0, 0]\n    loss = loss_fn(real_pred, fake_pred)\n    optimizer.step(loss, trigger=trigger.mean())\n```\n\n## Citation\n\nPlease cite this repository if you find the code useful.\n\n```latex\n@misc{cgds-package,\n  author = {Hongkai Zheng},\n  title = {CGDs},\n  year = {2020},\n  publisher = {GitHub},\n  journal = {GitHub repository},\n  howpublished = {\\url{https://github.com/devzhk/cgds-package}},\n}\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdevzhk%2Fcgds-package","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdevzhk%2Fcgds-package","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdevzhk%2Fcgds-package/lists"}