{"id":13534857,"url":"https://github.com/tensorflow/privacy","last_synced_at":"2025-05-13T15:11:13.792Z","repository":{"id":37561584,"uuid":"162747292","full_name":"tensorflow/privacy","owner":"tensorflow","description":"Library for training machine learning models with privacy for training data","archived":false,"fork":false,"pushed_at":"2025-04-23T20:16:27.000Z","size":2961,"stargazers_count":1962,"open_issues_count":130,"forks_count":457,"subscribers_count":58,"default_branch":"master","last_synced_at":"2025-05-08T00:09:45.152Z","etag":null,"topics":["machine-learning","privacy"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/tensorflow.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2018-12-21T18:46:46.000Z","updated_at":"2025-05-02T18:06:30.000Z","dependencies_parsed_at":"2023-02-19T09:45:34.054Z","dependency_job_id":"6968fe88-223d-4d02-9912-46638ec4a9f7","html_url":"https://github.com/tensorflow/privacy","commit_stats":{"total_commits":835,"total_committers":64,"mean_commits":13.046875,"dds":0.8011976047904191,"last_synced_commit":"e8856835a6a76f2806c107a7b96e1b98fd23a3c5"},"previous_names":[],"tags_count":18,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorflow%2Fprivacy","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorflow%2Fprivacy/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorflow%2Fpr
ivacy/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tensorflow%2Fprivacy/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/tensorflow","download_url":"https://codeload.github.com/tensorflow/privacy/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253969259,"owners_count":21992263,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["machine-learning","privacy"],"created_at":"2024-08-01T08:00:44.057Z","updated_at":"2025-05-13T15:11:08.773Z","avatar_url":"https://github.com/tensorflow.png","language":"Python","readme":"# TensorFlow Privacy\n\nThis repository contains the source code for TensorFlow Privacy, a Python\nlibrary that includes implementations of TensorFlow optimizers for training\nmachine learning models with differential privacy. The library comes with\ntutorials and analysis tools for computing the privacy guarantees provided.\n\nThe TensorFlow Privacy library is under continual development, always welcoming\ncontributions. In particular, we always welcome help towards resolving the\nissues currently open.\n\n## Latest Updates\n\n2024-02-14: As of version 0.9.0, the TensorFlow Privacy github repository will\nbe published as two separate PyPI packages. The first will inherit the name\ntensorflow-privacy and contain the parts related to training of DP models. 
The\nsecond, tensorflow-empirical-privacy, will contain the parts related to testing\nfor empirical privacy.\n\n2023-02-21: A new implementation of efficient per-example gradient clipping is\nnow available for\n[DP Keras models](https://github.com/tensorflow/privacy/tree/master/tensorflow_privacy/privacy/keras_models)\nconsisting only of Dense and Embedding layers. The models use the fast gradient\ncalculation results of [this paper](https://arxiv.org/abs/1510.01799). The\nimplementation should allow for DP training without any meaningful memory\nor runtime overhead. It also removes the need for tuning the number of\nmicrobatches as it clips the gradient with respect to each example.\n\n## Setting up TensorFlow Privacy\n\n### Dependencies\n\nThis library uses [TensorFlow](https://www.tensorflow.org/) to define machine\nlearning models. Therefore, installing TensorFlow (\u003e= 1.14) is a prerequisite.\nYou can find instructions [here](https://www.tensorflow.org/install/). For\nbetter performance, it is also recommended to install TensorFlow with GPU\nsupport (detailed instructions on how to do this are available in the TensorFlow\ninstallation documentation).\n\n### Installing TensorFlow Privacy\n\nIf you only want to use TensorFlow Privacy as a library, you can simply execute\n\n`pip install tensorflow-privacy`\n\nOtherwise, you can clone this GitHub repository into a directory of your choice:\n\n```\ngit clone https://github.com/tensorflow/privacy\n```\n\nYou can then install the local package in \"editable\" mode in order to add it to\nyour `PYTHONPATH`:\n\n```\ncd privacy\npip install -e .\n```\n\nIf you'd like to make contributions, we recommend first forking the repository\nand then cloning your fork rather than cloning this repository directly.\n\n## Contributing\n\nContributions are welcome! Bug fixes and new features can be initiated through\nGitHub pull requests. 
To speed the code review process, we ask that:\n\n*   When making code contributions to TensorFlow Privacy, you follow the `PEP8\n    with two spaces` coding style (the same as the one used by TensorFlow) in\n    your pull requests. In most cases this can be done by running `autopep8 -i\n    --indent-size 2 \u003cfile\u003e` on the files you have edited.\n\n*   You should also check your code with pylint and TensorFlow's pylint\n    [configuration file](https://raw.githubusercontent.com/tensorflow/tensorflow/master/tensorflow/tools/ci_build/pylintrc)\n    by running `pylint --rcfile=/path/to/the/tf/rcfile \u003cedited file.py\u003e`.\n\n*   When making your first pull request, you must\n    [sign the Google CLA](https://cla.developers.google.com/clas).\n\n*   We do not accept pull requests that add git submodules because of\n    [the problems that arise when maintaining git submodules](https://medium.com/@porteneuve/mastering-git-submodules-34c65e940407).\n\n## Tutorials directory\n\nTo help you get started with the functionalities provided by this library, we\nprovide a detailed walkthrough [here](tutorials/walkthrough/README.md) that will\nteach you how to wrap existing optimizers (e.g., SGD, Adam, ...) into their\ndifferentially private counterparts using TensorFlow (TF) Privacy. You will also\nlearn how to tune the parameters introduced by differentially private\noptimization and how to measure the privacy guarantees provided using analysis\ntools included in TF Privacy.\n\nIn addition, the `tutorials/` folder comes with scripts demonstrating how to use\nthe library features. The list of tutorials is described in the README included\nin the tutorials directory.\n\nNOTE: the tutorials are maintained carefully. However, they are not considered\npart of the API and they can change at any time without warning. 
You should not\nwrite third-party code that imports the tutorials and expect that the interface\nwill not break.\n\n## Research directory\n\nThis folder contains code to reproduce results from research papers related to\nprivacy in machine learning. It is not maintained as carefully as the tutorials\ndirectory, but is rather intended as a convenient archive.\n\n## TensorFlow 2.x\n\nTensorFlow Privacy now works with TensorFlow 2! You can use the new Keras-based\noptimizers found in\n`privacy/tensorflow_privacy/privacy/optimizers/dp_optimizer_keras.py`.\n\nFor this to work with `tf.keras.Model` and `tf.estimator.Estimator`, however,\nyou need to install TensorFlow 2.4 or later.\n\n## Remarks\n\nThe content of this repository supersedes the following existing folder in the\ntensorflow/models\n[repository](https://github.com/tensorflow/models/tree/master/research/differential_privacy).\n\n## Contacts\n\nIf you have any questions that cannot be addressed by raising an issue, feel\nfree to contact:\n\n*   Galen Andrew (@galenmandrew)\n*   Steve Chien (@schien1729)\n*   Nicolas Papernot (@npapernot)\n\n## Copyright\n\nCopyright 2019 - Google LLC\n","funding_links":[],"categories":["Software","Differential Privacy Tools","Libraries","Open Source Security Tools","Privacy Preserving Machine Learning","Tools","Data Security \u0026 Poisoning","隐私机器学习","Python","其他_机器学习与深度学习","Awesome Privacy Engineering [![Awesome](https://awesome.re/badge.svg)](https://awesome.re)","\u003ca id=\"tools\"\u003e\u003c/a\u003e🛠️ Tools","Privacy and Safety","5.4 DP Libraries","Code and Projects","Technical Resources","Model Fairness and Privacy","LLM SECURITY / AI SECURITY","Privacy"],"sub_categories":["Interfaces","Privacy","Winetricks","Objective-C Tools, Libraries, and Frameworks","Differential Privacy and Federated Learning","Model Fairness \u0026 Privacy","Mesh networks","Open Source/Access Responsible AI Software Packages","AI Model Security \u0026 Privacy","Professional 
Privacy"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftensorflow%2Fprivacy","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ftensorflow%2Fprivacy","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftensorflow%2Fprivacy/lists"}