awesome-differential-privacy


https://github.com/Stopwolf/awesome-differential-privacy

  • The Algorithmic Foundations of Differential Privacy - Book by the creators of Differential Privacy, Cynthia Dwork and Aaron Roth. Not recommended as an introductory resource, though.
  • Programming Differential Privacy - Book by Joseph P. Near and Chiké Abuah. Created for a course at the University of Vermont. Full of examples and Python code.
  • Differential Privacy: A Primer for a Non-Technical Audience - Book by Wood et al. (including Kobbi Nissim), also analyzes legal aspects.
  • CS 860 - Algorithms for Private Data Analysis - Course taught by Gautam Kamath at the University of Waterloo. The course has lecture videos (as a [YouTube playlist](https://www.youtube.com/playlist?list=PLmd_zeMNzSvRRNpoEWkVo6QY_6rR3SHjp)), lecture notes and additional readings. More theoretical, but an **excellent introductory** course on Differential Privacy.
  • CS211: Data Privacy - Course taught by Joe Near and Protiva Sen at the University of Vermont. Lecture slides only (no videos); homework and weekly assignments via Jupyter notebooks.
  • Privacy Preserving Machine Learning - Course taught by Aurélien Bellet at the University of Lille. Lecture slides only (no videos); practical sessions in Jupyter notebooks. Definitely a more advanced course.
  • Damien Desfontaines' Personal Blog - His personal curated series of blog posts that serve as a friendly introduction to differential privacy.
  • differentialprivacy.org - Resource for the differential privacy research community and anyone who wants to learn more about it. Also has a mailing list.
  • gretel.ai Blog - Gretel.ai's blog about privacy in machine learning, differential privacy and data sharing. More focused on creating synthetic data.
  • OpenMined Blog - All OpenMined blog posts on the topic of differential privacy.
  • PyTorch Blog - Differential Privacy Series currently consisting of [two](https://medium.com/pytorch/differential-privacy-series-part-1-dp-sgd-algorithm-explained-12512c3959a3) [parts](https://medium.com/pytorch/differential-privacy-series-part-2-efficient-per-sample-gradient-computation-in-opacus-5bf4031d9e22) explaining concepts like differential privacy and DP-SGD, and their inner workings in Opacus.
  • opacus - PyTorch-based library for Differential Privacy. You can read the whitepaper [here](https://arxiv.org/abs/2109.12298).
  • tensorflow-privacy - TensorFlow library for Differential Privacy.
  • PyDP - OpenMined's Python wrapper around Google's differential privacy library.
  • PrivacyRaven - Privacy testing Python library for deep learning systems.
  • diffprivlib - IBM's Python library for Differential Privacy.
  • deepee - Implements DP-SGD in PyTorch, but works for all first-order optimizers.
  • pyvacy - Implementation of DP-SGD for PyTorch.
  • autodp - Library for automating Differential Privacy computation.
  • Deep Learning with Differential Privacy - Paper that introduced the Differentially Private SGD (DP-SGD) optimizer, which enabled private deep learning.
  • Learning Differentially Private Recurrent Language Models - Paper that introduced training of large recurrent language models with user-level differential privacy guarantees at only a negligible cost in predictive accuracy.
  • Privacy Regularization: Joint Privacy-Utility Optimization in Language Models - Paper that introduced privacy regularization of LMs to preserve model utility and uniform treatment of under-represented subgroups.
  • Removing Disparate Impact of DP-SGD on Model Accuracy - Paper that introduced changes to the regular DP-SGD optimizer in order to make it fair. You can view my paper summary (presentation) [here](https://docs.google.com/presentation/d/1aUpQd9LmcsML726c7EDiiFusx0KDIK2t4lPJEuy4gCM/edit?usp=sharing).
  • The Secret Sharer: Evaluating and Testing Unintended Memorization in Neural Networks - Paper that observed unintended memorization in LMs and showed how differential privacy can be used to avoid that memorization.
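To make the core idea behind the resources above concrete: the Laplace mechanism (covered in Dwork & Roth's book and implemented by libraries like diffprivlib and PyDP) releases a query answer with epsilon-differential privacy by adding noise scaled to the query's sensitivity. Below is a minimal plain-numpy sketch, not any particular library's API; the function name and dataset are illustrative.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with epsilon-DP by adding Laplace noise.

    The noise scale is sensitivity / epsilon: lower epsilon (stronger
    privacy) means more noise. This is the classic mechanism from
    Dwork & Roth; a real deployment would use a vetted library.
    """
    rng = rng if rng is not None else np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: a counting query over a toy dataset. Adding or removing one
# person changes the count by at most 1, so the sensitivity is 1.
ages = [23, 45, 31, 62, 51]
true_count = sum(a > 40 for a in ages)
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=1.0)
```

With epsilon=1.0 the released count is typically within a few units of the true value; tightening epsilon trades accuracy for privacy.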
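The DP-SGD algorithm from "Deep Learning with Differential Privacy" (and implemented by Opacus, tensorflow-privacy, deepee and pyvacy) modifies SGD in two ways: each example's gradient is clipped to a fixed L2 norm, then calibrated Gaussian noise is added to the aggregated gradient. The sketch below shows those two steps on a least-squares model in plain numpy; it is a simplification (full batch, no Poisson sampling, no privacy accounting), and the function name `dp_sgd_step` is illustrative, not a library API.

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD step for least-squares regression (simplified sketch).

    1. Compute the gradient of each example separately.
    2. Clip each per-example gradient to L2 norm `clip_norm`.
    3. Sum, add Gaussian noise with std noise_multiplier * clip_norm,
       average, and take a gradient step.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    residuals = X @ w - y                        # shape (n,)
    per_example_grads = residuals[:, None] * X   # shape (n, d)
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip_norm)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=w.shape)
    grad = (clipped.sum(axis=0) + noise) / len(X)
    return w - lr * grad

# Toy usage: fit y ≈ X @ [1.5, -2.0] privately.
rng = np.random.default_rng(42)
X = rng.normal(size=(64, 2))
y = X @ np.array([1.5, -2.0]) + rng.normal(scale=0.1, size=64)
w = np.zeros(2)
for _ in range(200):
    w = dp_sgd_step(w, X, y, rng=rng)
```

Clipping bounds any single example's influence on the update, which is what lets the added Gaussian noise translate into a formal (epsilon, delta) guarantee via the moments accountant in the original paper.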