SAP-samples/machine-learning-diff-private-federated-learning

Simulate a federated setting and run differentially private federated learning.
Federated Learning is a privacy-preserving decentralized learning protocol introduced by Google. Multiple clients jointly learn a model without centralizing their data; centralization is pushed from data space to parameter space: https://research.google.com/pubs/pub44822.html [1]. Differential privacy in deep learning is concerned with preserving the privacy of individual data points: https://arxiv.org/abs/1607.00133 [2]. In this work we combine both notions by making federated learning differentially private. We focus on preserving privacy for the entire data set of a client. For more information, please refer to: https://arxiv.org/abs/1712.07557v2.
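As an illustration of the underlying idea (not of this repository's API), the following minimal NumPy sketch shows client-level differentially private aggregation: each sampled client's model update is clipped to an L2 norm bound, the clipped updates are averaged, and Gaussian noise scaled by the noise multiplier sigma is added before the global model is updated. All names (dp_federated_average, clip_bound) are hypothetical.

import numpy as np

def dp_federated_average(client_updates, clip_bound, sigma, rng=None):
    """Clip each client's update, average, and add Gaussian noise (illustrative sketch)."""
    rng = rng or np.random.RandomState(0)
    m = len(client_updates)
    # Bound each client's influence by clipping the update to L2 norm <= clip_bound.
    clipped = [u / max(1.0, np.linalg.norm(u) / clip_bound) for u in client_updates]
    avg = np.mean(clipped, axis=0)
    # Gaussian noise calibrated to the clipping bound and the number of sampled clients.
    noise = rng.normal(0.0, sigma * clip_bound / m, size=avg.shape)
    return avg + noise

# Example: 30 sampled clients, flattened model updates of dimension 10.
updates = [np.random.randn(10) for _ in range(30)]
global_update = dp_federated_average(updates, clip_bound=1.0, sigma=1.5)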
This code simulates a federated setting and enables federated learning with differential privacy. The privacy accountant used is from https://arxiv.org/abs/1607.00133 [2]. The files accountant.py, utils.py, and gaussian_moments.py are taken from: https://github.com/tensorflow/models/tree/master/research/differential_privacy
Note that the privacy agent is not completely set up yet (especially for more than 100 clients). It either has to be specified manually, or the parameters 'm' (the number of clients sampled per communication round) and 'sigma' (the noise multiplier) have to be set directly; a sketch of how these two parameters relate to the privacy budget follows below.
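For illustration, the sketch below shows how the moments accountant shipped with this repository could turn hand-picked values of 'm' and 'sigma' into an (epsilon, delta) estimate after a given number of communication rounds. It assumes the accountant.py interface taken from the TensorFlow models repository (GaussianMomentsAccountant, accumulate_privacy_spending, get_privacy_spent); exact signatures may differ between versions, so treat it as a sketch rather than documented usage.

import tensorflow as tf
import accountant  # shipped with this repository (taken from tensorflow/models)

N = 100        # total number of clients
m = 30         # clients sampled per communication round (hand-picked)
sigma = 1.5    # noise multiplier (hand-picked)
rounds = 50
delta = 1e-3   # target delta

priv_accountant = accountant.GaussianMomentsAccountant(N)
# One accounting step per communication round in which m out of N clients participate.
accum_op = priv_accountant.accumulate_privacy_spending(None, sigma, m)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(rounds):
        sess.run(accum_op)
    # Returns a list of (epsilon, delta) pairs for the requested target deltas.
    print(priv_accountant.get_privacy_spent(sess, target_deltas=[delta]))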
1. Install Tensorflow 1.4.1
2. Download the files as a ZIP archive, or you can clone the repository to your local hard drive.
3. Change to the directory of the download. If using macOS, simply run:
bash RUNME.sh
This will download the MNIST data sets, create the clients, and get you started.
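If RUNME.sh cannot be used, the following sketch illustrates the kind of setup the script performs: the MNIST training data are downloaded and split into disjoint shards, one per simulated client. It uses the TensorFlow 1.x tutorial loader and hypothetical variable names; it is not the repository's actual setup code.

import numpy as np
from tensorflow.examples.tutorials.mnist import input_data

NUM_CLIENTS = 100  # illustrative choice

# Download (if necessary) and load MNIST via the TensorFlow 1.x tutorial helper.
mnist = input_data.read_data_sets('MNIST_data', one_hot=True)

# Split the training set into disjoint, equally sized shards, one per client.
images = np.array_split(mnist.train.images, NUM_CLIENTS)
labels = np.array_split(mnist.train.labels, NUM_CLIENTS)
clients = [{'x': x, 'y': y} for x, y in zip(images, labels)]

print('client 0 holds', clients[0]['x'].shape[0], 'examples')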
For more information on the individual functions, please refer to their doc strings.
No issues known
This project is provided "as-is" and any bug reports are not guaranteed to be fixed.
If you use this code or the pretrained models in your research, please cite:
@ARTICLE{2017arXiv171207557G,
  author        = {{Geyer}, R.~C. and {Klein}, T. and {Nabi}, M.},
  title         = "{Differentially Private Federated Learning: A Client Level Perspective}",
  journal       = {ArXiv e-prints},
  archivePrefix = "arXiv",
  eprint        = {1712.07557},
  primaryClass  = "cs.CR",
  keywords      = {Computer Science - Cryptography and Security, Computer Science - Learning, Statistics - Machine Learning},
  year          = 2017,
  month         = dec,
  adsurl        = {http://adsabs.harvard.edu/abs/2017arXiv171207557G},
  adsnote       = {Provided by the SAO/NASA Astrophysics Data System}
}

[1] H. Brendan McMahan et al., Communication-Efficient Learning of Deep Networks from Decentralized Data, 2017, http://arxiv.org/abs/1602.05629.
[2] Martin Abadi et al., Deep Learning with Differential Privacy, 2016, https://arxiv.org/abs/1607.00133.
Copyright (c) 2024 SAP SE or an SAP affiliate company. All rights reserved. This project is licensed under the Apache Software License, version 2.0, except as noted otherwise in the LICENSE file.