
Opacus



Opacus is a library that enables training PyTorch models with differential privacy. It supports training with minimal code changes required on the client, has little impact on training performance, and allows the client to track online the privacy budget expended at any given moment.

Target audience

This code release is aimed at two target audiences:

  1. ML practitioners will find this to be a gentle introduction to training a model with differential privacy, as it requires minimal code changes.
  2. Differential privacy researchers will find this easy to experiment and tinker with, allowing them to focus on what matters.

Latest updates

2024-12-18: We updated this tutorial to show how LoRA and the peft library can be used in conjunction with DP-SGD.

2024-08-20: We introduced Fast Gradient Clipping and Ghost Clipping (https://arxiv.org/abs/2110.05679) to Opacus, significantly reducing the memory requirements of DP-SGD. Please refer to our blogpost for more information.
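As a rough sketch of how Ghost Clipping is enabled: the `grad_sample_mode="ghost"` flag and the wrapped criterion returned by `make_private()` below are assumptions based on the blogpost, so check your installed Opacus version's documentation for the exact API.

```python
# Hedged sketch: enabling Ghost Clipping so per-sample gradients are
# never materialized. grad_sample_mode="ghost" and the returned wrapped
# criterion are assumed from the blogpost, not guaranteed API.
import torch
from torch import nn
from opacus import PrivacyEngine

model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
criterion = nn.CrossEntropyLoss()
dataset = torch.utils.data.TensorDataset(
    torch.randn(64, 16), torch.randint(0, 2, (64,))
)
data_loader = torch.utils.data.DataLoader(dataset, batch_size=8)

privacy_engine = PrivacyEngine()
model, optimizer, criterion, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    criterion=criterion,  # the loss computation is wrapped for ghost clipping
    noise_multiplier=1.1,
    max_grad_norm=1.0,
    grad_sample_mode="ghost",  # clip via per-sample gradient norms only
)
```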

Installation

The latest release of Opacus can be installed via pip:

```bash
pip install opacus
```

OR, alternatively, via conda:

```bash
conda install -c conda-forge opacus
```

You can also install directly from the source for the latest features (along with its quirks and potentially occasional bugs):

```bash
git clone https://github.com/pytorch/opacus.git
cd opacus
pip install -e .
```

Getting started

To train your model with differential privacy, all you need to do is to instantiate a PrivacyEngine and pass your model, data_loader, and optimizer to the engine's make_private() method to obtain their private counterparts.

```python
# define your components as usual
model = Net()
optimizer = SGD(model.parameters(), lr=0.05)
data_loader = torch.utils.data.DataLoader(dataset, batch_size=1024)

# enter PrivacyEngine
privacy_engine = PrivacyEngine()
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.1,
    max_grad_norm=1.0,
)
# Now it's business as usual
```
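If you would rather specify a target privacy budget than tune the noise multiplier by hand, PrivacyEngine also exposes make_private_with_epsilon(). A minimal sketch, continuing from the snippet above; the argument names are assumed from recent Opacus releases:

```python
# Hedged sketch: derive the noise multiplier from a target (epsilon, delta)
# budget instead of setting it manually. Argument names assumed from
# recent Opacus releases.
model, optimizer, data_loader = privacy_engine.make_private_with_epsilon(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    epochs=10,
    target_epsilon=8.0,
    target_delta=1e-5,
    max_grad_norm=1.0,
)
```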

The MNIST example shows an end-to-end run using Opacus (see also the training-loop sketch below). The examples folder contains more such examples.
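To illustrate the "business as usual" part and the online privacy accounting mentioned above, here is a minimal hedged sketch of a training loop, reusing the model, optimizer, data_loader, and privacy_engine from the snippet above; get_epsilon() is part of PrivacyEngine's accounting API, but treat the exact call as an assumption for your installed version:

```python
import torch
from torch import nn

# Minimal sketch of training after make_private(); reuses the
# model/optimizer/data_loader/privacy_engine defined above.
criterion = nn.CrossEntropyLoss()
for epoch in range(3):
    for data, target in data_loader:
        optimizer.zero_grad()
        loss = criterion(model(data), target)
        loss.backward()
        optimizer.step()  # per-sample clipping + noise addition happen here
    # Query the accountant for the budget spent so far (assumed API).
    epsilon = privacy_engine.get_epsilon(delta=1e-5)
    print(f"epoch {epoch}: epsilon = {epsilon:.2f} at delta = 1e-5")
```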

Learn more

Interactive tutorials

We've built a series of IPython-based tutorials as a gentle introduction to training models with privacy and using various Opacus features.

Technical report and citation

The technical report introducing Opacus, presenting its design principles, mathematical foundations, and benchmarks, can be found here.

Consider citing the report if you use Opacus in your papers, as follows:

```bibtex
@article{opacus,
  title={Opacus: {U}ser-Friendly Differential Privacy Library in {PyTorch}},
  author={Ashkan Yousefpour and Igor Shilov and Alexandre Sablayrolles and Davide Testuggine and Karthik Prasad and Mani Malek and John Nguyen and Sayan Ghosh and Akash Bharadwaj and Jessica Zhao and Graham Cormode and Ilya Mironov},
  journal={arXiv preprint arXiv:2109.12298},
  year={2021}
}
```

Blogposts and talks

If you want to learn more about DP-SGD and related topics, check out our series of blogposts and talks.

FAQ

Check out the FAQ page for answers to some of the most frequently asked questions about differential privacy and Opacus.

Contributing

See the CONTRIBUTING file for how to help out. Do also check out the README files inside the repo to learn how the code is organized.

License

This code is released under Apache 2.0, as found in the LICENSE file.

