{"id":19066178,"url":"https://github.com/epfml/denseformer","last_synced_at":"2025-04-28T12:27:39.297Z","repository":{"id":222320621,"uuid":"752450375","full_name":"epfml/DenseFormer","owner":"epfml","description":null,"archived":false,"fork":false,"pushed_at":"2024-03-26T14:38:25.000Z","size":40,"stargazers_count":65,"open_issues_count":0,"forks_count":5,"subscribers_count":5,"default_branch":"main","last_synced_at":"2024-03-26T15:50:17.056Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/epfml.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null}},"created_at":"2024-02-03T21:53:48.000Z","updated_at":"2024-03-26T11:14:30.000Z","dependencies_parsed_at":"2024-02-13T16:13:19.688Z","dependency_job_id":"17032f84-352e-4f7c-905e-7ff39342b57d","html_url":"https://github.com/epfml/DenseFormer","commit_stats":null,"previous_names":["epfml/denseformer"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2FDenseFormer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2FDenseFormer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2FDenseFormer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2FDenseFormer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/epfml","download_url":"https://codeload.github.com/epfml/DenseFormer/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://
github.com","kind":"github","repositories_count":223772219,"owners_count":17199977,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-09T00:55:02.473Z","updated_at":"2024-11-09T00:55:03.198Z","avatar_url":"https://github.com/epfml.png","language":"Python","readme":"# DenseFormer\n\nThis repository provides a Python package for implementing DenseFormers, as described in the paper *DenseFormer: Enhancing Information Flow in Transformers via Depth Weighted Averaging*.\n\n## Installation\n\nThe code is arranged as a `denseformer` package. To install the `denseformer` package, run:\n\n```\npip install -e .\n```\n\n## Usage\n\nThe following shows how to transform a simplified Transformer class into a DenseFormer in only 3 steps:\n\n```python\nimport torch\nfrom denseformer import DWAModules\n\nclass DenseFormer(torch.nn.Module):\n\n  def __init__(self, config):\n    super().__init__()\n    self.config = config\n    self.dwa_modules = DWAModules(config.n_blocks, config.dilation, config.dwa_period) # Step 1\n    self.wte = torch.nn.Embedding(config.vocab_size, config.n_embd)\n    self.blocks = torch.nn.ModuleList([Block(config) for _ in range(config.n_blocks)])\n    self.ln_f = LayerNorm(config.n_embd, bias=config.bias)\n    self.lm_head = torch.nn.Linear(config.n_embd, config.vocab_size, bias=False)\n    self.wte.weight = self.lm_head.weight  # weight tying\n\n  def forward(self, idx):\n    x = self.wte(idx)\n    self.dwa_modules.init_accumulators(x) # Step 2\n    for i in range(self.config.n_blocks):\n      x = self.blocks[i](x)\n      x = self.dwa_modules(x, block_idx=i) # Step 3\n    x = self.ln_f(x)\n    logits = self.lm_head(x)\n    return logits\n```\n\n## Warning\n\nThe DWA modules use `nn.Linear` submodules for the DWA weights. If you force a custom initialization on all `nn.Linear` submodules, you might break the DWA initialization. In that case, simply call `self.dwa_modules._init_weights()` again.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fepfml%2Fdenseformer","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fepfml%2Fdenseformer","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fepfml%2Fdenseformer/lists"}