{"id":13604389,"url":"https://github.com/DachengLi1/MPCFormer","last_synced_at":"2025-04-12T02:30:49.004Z","repository":{"id":65438308,"uuid":"529412525","full_name":"DachengLi1/MPCFormer","owner":"DachengLi1","description":"(ICLR 2023 Spotlight) MPCFormer: fast, performant, and private transformer inference with MPC","archived":false,"fork":false,"pushed_at":"2023-06-12T02:42:07.000Z","size":12954,"stargazers_count":94,"open_issues_count":2,"forks_count":15,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-04-02T05:14:38.199Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/DachengLi1.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2022-08-26T21:44:04.000Z","updated_at":"2025-04-01T08:53:51.000Z","dependencies_parsed_at":"2024-01-19T10:12:40.849Z","dependency_job_id":"d2ae7b0e-f1f1-445e-8354-fafa0814c0fb","html_url":"https://github.com/DachengLi1/MPCFormer","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DachengLi1%2FMPCFormer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DachengLi1%2FMPCFormer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DachengLi1%2FMPCFormer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DachengLi1%2FMPCFormer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/DachengLi1","download_url":"https://codeload.github.com/DachengLi1/MPCFormer/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248506901,"owners_count":21115503,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-01T19:00:44.870Z","updated_at":"2025-04-12T02:30:43.995Z","avatar_url":"https://github.com/DachengLi1.png","language":"Python","readme":"# MPCFormer: fast, performant, and private transformer inference with MPC.\r\n[**Paper**](https://arxiv.org/pdf/2211.01452.pdf) | \r\n[**Usage**](#usage) |\r\n[**Citation**](#citation) |\r\n[**Video**](https://recorder-v3.slideslive.com/?share=81100\u0026s=69c62c48-1730-4eb9-84ae-586fa4ff937d) |\r\n\r\nThis repository contains the official code for our ICLR 2023 spotlight paper MPCFormer: fast, performant, and private transformer inference with MPC.\r\nWe design MPCFormer to protect users' data privacy by using Secure Multiparty Computation(MPC). 
## Performance
MPCFormer achieves a 5.26x speedup for BERT-Base MPC inference while preserving comparable ML accuracy. More comprehensive results, e.g. on BERT-Large and RoBERTa, can be found in the paper.

<img src="figures/result_imdb.PNG" width="300"> <img src="figures/result_glue.PNG" width="600">

## Usage
To install the necessary packages, install the transformers directory in editable mode:

    git clone https://github.com/MccRee177/MPCFormer
    cd MPCFormer/transformers
    pip install -e .

#### Step 1: Obtain a teacher Transformer model by fine-tuning it on downstream tasks [**Here**](src/baselines)
We support GLUE and IMDb; other datasets can easily be supported via the transformers library.

#### Step 2: Perform the approximation and distillation of MPCFormer [**Here**](src/main).

#### (Optional) Evaluate the baselines in the paper [**Here**](src/baselines).

#### (Optional) Benchmark the inference time of the approximated model [**Here**](src/benchmark) (a rough plaintext timing sketch appears at the end of this README).

## Citation
If you find this repository useful, please cite our paper using
````
@article{li2022mpcformer,
  title={MPCFormer: fast, performant and private Transformer inference with MPC},
  author={Li, Dacheng and Shao, Rulin and Wang, Hongyi and Guo, Han and Xing, Eric P and Zhang, Hao},
  journal={arXiv preprint arXiv:2211.01452},
  year={2022}
}
````
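For intuition about what the benchmark step measures, here is a hedged plaintext latency probe in plain PyTorch. It is not the repo's benchmark harness in [src/benchmark](src/benchmark), which times actual MPC inference; the function name and arguments below are illustrative.

```python
import time
import torch

@torch.no_grad()
def time_inference(model, input_ids, n_warmup: int = 5, n_runs: int = 20) -> float:
    """Rough plaintext latency (seconds per forward pass) for an
    approximated model. MPC inference, which MPCFormer targets, is far
    slower per operation, so treat this only as a relative sanity check."""
    model.eval()
    for _ in range(n_warmup):        # warm up caches / kernels
        model(input_ids)
    start = time.perf_counter()
    for _ in range(n_runs):
        model(input_ids)
    return (time.perf_counter() - start) / n_runs
```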