https://github.com/kartikey2807/differentially-private-sgd
Implement differentially private SGD. A smaller privacy budget ε (stronger privacy) leads to lower test accuracy.
differential-privacy dpsgd opacus stochastic-gradient-descent
- Host: GitHub
- URL: https://github.com/kartikey2807/differentially-private-sgd
- Owner: kartikey2807
- Created: 2025-05-26T09:59:36.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2025-08-16T00:17:30.000Z (2 months ago)
- Last Synced: 2025-08-16T02:35:59.303Z (2 months ago)
- Topics: differential-privacy, dpsgd, opacus, stochastic-gradient-descent
- Language: Jupyter Notebook
- Homepage:
- Size: 22.6 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Differentially-Private SGD
Reference: [Abadi et al., *Deep Learning with Differential Privacy*](https://arxiv.org/pdf/1607.00133)
**Summary**
* Differential privacy is added at each *gradient step*: per-sample gradients are clipped and Gaussian noise is added
* A tradeoff exists between utility (accuracy) and privacy
* Training under a privacy budget $\epsilon$ ensures the learned weights are private
* Gradient access is private --> the model is differentially private (by **post-processing**)
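The clip-then-noise gradient step described above can be sketched in plain NumPy. This is a minimal illustration, not the repo's Opacus-based code; the function name and the least-squares loss are assumptions made for the example:

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip=1.0, sigma=1.0, rng=None):
    """One DP-SGD step for a least-squares loss (illustrative sketch).

    clip  -- per-sample L2 gradient norm bound C
    sigma -- noise multiplier; noise std is sigma * C / batch_size
    """
    rng = np.random.default_rng(0) if rng is None else rng
    batch_size = X.shape[0]
    # Per-example gradients of 0.5 * (x·w - y)^2 with respect to w
    residuals = X @ w - y                  # shape (B,)
    grads = residuals[:, None] * X         # shape (B, d)
    # Clip each per-example gradient to L2 norm <= clip
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
    # Average, then add Gaussian noise calibrated to the clipping bound
    noise = rng.normal(0.0, sigma * clip / batch_size, size=w.shape)
    noisy_grad = grads.mean(axis=0) + noise
    return w - lr * noisy_grad
```

Because every downstream use of `w` only sees the noised gradients, the trained model inherits the same differential-privacy guarantee by post-processing.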
**Results**
On the MNIST dataset
|Model|Test Accuracy|
| :- | :- |
|Vanilla|97.00%|
|Differentially Private|91.00%|