{"id":15503012,"url":"https://github.com/esceptico/squeezer","last_synced_at":"2025-04-22T23:21:33.804Z","repository":{"id":114860227,"uuid":"418156213","full_name":"esceptico/squeezer","owner":"esceptico","description":"Lightweight knowledge distillation pipeline","archived":false,"fork":false,"pushed_at":"2021-11-29T08:30:34.000Z","size":119,"stargazers_count":28,"open_issues_count":0,"forks_count":0,"subscribers_count":4,"default_branch":"master","last_synced_at":"2024-10-19T17:30:36.203Z","etag":null,"topics":["distillation","knowledge-distillation","model-compression","pytorch"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/esceptico.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-10-17T14:23:51.000Z","updated_at":"2024-01-04T17:02:25.000Z","dependencies_parsed_at":"2023-03-13T13:04:32.828Z","dependency_job_id":null,"html_url":"https://github.com/esceptico/squeezer","commit_stats":{"total_commits":24,"total_committers":2,"mean_commits":12.0,"dds":0.08333333333333337,"last_synced_commit":"98bc4c7923c6aa3b12ac81444d79392826fc34c6"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/esceptico%2Fsqueezer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/esceptico%2Fsqueezer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/esceptico%2Fsqueezer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1
/hosts/GitHub/repositories/esceptico%2Fsqueezer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/esceptico","download_url":"https://codeload.github.com/esceptico/squeezer/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250338497,"owners_count":21414196,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["distillation","knowledge-distillation","model-compression","pytorch"],"created_at":"2024-10-02T09:11:51.495Z","updated_at":"2025-04-22T23:21:33.797Z","avatar_url":"https://github.com/esceptico.png","language":"Jupyter Notebook","readme":"# Squeezer (WIP)\n\n## Usage\n### Step 1: Define a Distiller subclass\nImplement the `teacher_forward` and `student_forward` methods,\nand (if required) `move_batch_to_device`.\n```python\nfrom squeezer import Distiller\n\n\nclass CustomDistiller(Distiller):\n    def teacher_forward(self, batch):\n        return self.teacher(batch['data'])\n\n    def student_forward(self, batch):\n        return self.student(batch['data'])\n```\n### Step 2: Define a loss policy\nSubclass `AbstractDistillationPolicy` and return the total loss\nalong with a dict of scalar values for logging.\n```python\nfrom torch.nn.functional import mse_loss\n\nfrom squeezer import AbstractDistillationPolicy\n\n\nclass DistillationPolicy(AbstractDistillationPolicy):\n    def forward(self, teacher_output, student_output, batch, epoch):\n        loss_mse = mse_loss(student_output, teacher_output)\n        loss_dict = {'mse': loss_mse.item()}\n        return loss_mse, loss_dict\n```\n\n### Step 3: Fit\n```python\nfrom torch import optim\n\nfrom squeezer.logging import TensorboardLogger\n\n\ntrain_loader = 
...\n\nteacher = Teacher()\nstudent = Student()\n\nlogger = TensorboardLogger('runs', 'experiment')\noptimizer = optim.AdamW(student.parameters(), lr=3e-4)\npolicy = DistillationPolicy()\ndistiller = CustomDistiller(teacher, student, policy, optimizer=optimizer, logger=logger)\n\ndistiller(train_loader, n_epochs=10)\ndistiller.save('path_to_some_directory')\n```","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fesceptico%2Fsqueezer","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fesceptico%2Fsqueezer","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fesceptico%2Fsqueezer/lists"}