{"id":19066196,"url":"https://github.com/epfml/ghost-noise","last_synced_at":"2025-08-09T17:04:11.212Z","repository":{"id":170927941,"uuid":"645258020","full_name":"epfml/ghost-noise","owner":"epfml","description":null,"archived":false,"fork":false,"pushed_at":"2023-08-18T13:33:16.000Z","size":2359,"stargazers_count":4,"open_issues_count":0,"forks_count":0,"subscribers_count":4,"default_branch":"main","last_synced_at":"2025-06-13T18:48:58.828Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/epfml.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-05-25T09:01:06.000Z","updated_at":"2024-02-02T05:15:59.000Z","dependencies_parsed_at":null,"dependency_job_id":"a72088fc-547f-469c-9d67-ccee804f9385","html_url":"https://github.com/epfml/ghost-noise","commit_stats":null,"previous_names":["epfml/ghost-noise"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/epfml/ghost-noise","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fghost-noise","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fghost-noise/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fghost-noise/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fghost-noise/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/epfml","download_url":"https://codeload.github.com/epfml/ghost-noise/tar.gz/r
efs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/epfml%2Fghost-noise/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":269606235,"owners_count":24446147,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-08-09T02:00:10.424Z","response_time":111,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-09T00:55:23.790Z","updated_at":"2025-08-09T17:04:11.149Z","avatar_url":"https://github.com/epfml.png","language":"Python","readme":"# Ghost Noise for Regularizing Deep Neural Networks\nThis is the official code for our manuscript https://arxiv.org/abs/2305.17205 where we investigate the regularization effects of using finite batch sizes with batch normalization. The abstract is repeated below:\n\nBatch Normalization (BN) is widely used to stabilize the optimization process and improve the test performance of deep neural networks. The regularization effect of BN depends on the batch size and explicitly using smaller batch sizes with Batch Normalization, a method known as Ghost Batch Normalization (GBN), has been found to improve generalization in many settings. We investigate the effectiveness of GBN by disentangling the induced \"Ghost Noise\" from normalization and quantitatively analyzing the distribution of noise as well as its impact on model performance. 
Inspired by our analysis, we propose a new regularization technique called Ghost Noise Injection (GNI) that imitates the noise in GBN without incurring the detrimental train-test discrepancy effects of small batch training. We experimentally show that GNI can provide a greater generalization benefit than GBN. Ghost Noise Injection can also be beneficial in otherwise non-noisy settings such as layer-normalized networks, providing additional evidence of the usefulness of Ghost Noise in Batch Normalization as a regularizer.\n\n# Code Organization\nOur core implementations are in the nodo subdirectory, in particular [nodo/ghost_noise_injector_replacement.py](nodo/ghost_noise_injector_replacement.py), which implements Ghost Noise Injection using sampling with replacement.\n\nWe use a slightly modified version of the [TIMM library](https://huggingface.co/docs/timm/index) by Ross Wightman and HuggingFace in our experiments.\nOur version is in [submodules/timm/](submodules/timm/). \nThe modifications primarily focus on CIFAR compatibility and more flexibility in the construction of the models.\n\nThe [shared/](shared/) subdirectory contains tools to integrate our code with TIMM.\nOur experiments expect the root directory to be in the PYTHONPATH environment variable, where they look for the shared directory.\nThis can be done by adding `PYTHONPATH=$PYTHONPATH:/path/to/current/dir` in front of your python launch command.\n\nExamples of launch commands can be found in the [experiment_scripts folder](experiment_scripts).\n\nWe include an exported conda environment in [environment.yml](environment.yml) that we used in our runs. 
Our code only relies on the core PyTorch library but TIMM has additional dependencies.\nWe use [wandb](https://wandb.ai/) for experiment logging through TIMM.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fepfml%2Fghost-noise","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fepfml%2Fghost-noise","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fepfml%2Fghost-noise/lists"}