AmanPriyanshu/DP-HyperparamTuning
DP-HyperparamTuning offers an array of tools for fast and easy tuning of hyperparameters for the DP-SGD algorithm.
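For context, the two knobs the toolkit searches over are DP-SGD's noise multiplier and the learning rate (the search_space_nm and search_space_lr arguments used below). As a minimal sketch of where these knobs enter a DP-SGD training setup, here is a standard Opacus configuration; Opacus as the DP-SGD backend, and the toy model and data, are assumptions for illustration, not stated dependencies of this repository:

```python
import torch
from opacus import PrivacyEngine  # assumption: Opacus as the DP-SGD backend

# Toy model and data, for illustration only.
model = torch.nn.Sequential(torch.nn.Linear(10, 1), torch.nn.Sigmoid())
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # learning rate: one tuned knob
data_loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(64, 10), torch.rand(64, 1)),
    batch_size=8,
)

# Wrap the training objects so gradients are clipped and noised per DP-SGD.
privacy_engine = PrivacyEngine()
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.0,  # noise multiplier: the other tuned knob
    max_grad_norm=1.0,
)
```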


The official repository for all algorithms and code for the paper Efficient Hyperparameter Optimization for Differentially Private Deep Learning, accepted at the PPML Workshop @ ACM-CCS'2021.

Note: a streamlined and basic implementation of all the modules presented is available as a Colab demo.

Implementation:

Imports:

```python
from DP_HyperparamTuning.experiment.train_single_model import Experiment
from DP_HyperparamTuning.algorithms.bayesian_optimization import Bayesian
from DP_HyperparamTuning.algorithms.grid_search_algorithm import GridSearch
from DP_HyperparamTuning.algorithms.evolutionary_optimization import EvolutionaryOptimization
from DP_HyperparamTuning.algorithms.reinforcement_learning_optimization import RLOptimization
```

Running Given Modules:

```python
e = Experiment(get_model, criterion, train_dataset, test_dataset)
b = Bayesian(e.run_experiment, calculate_reward, num_limit,
             search_space_nm=search_space_nm, search_space_lr=search_space_lr)
```

Where get_model and calculate_reward are functions, criterion is a loss such as <class 'torch.nn.modules.loss.BCELoss'>, and train_dataset and test_dataset are torch.utils.data.Dataset objects.
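A minimal end-to-end sketch under stated assumptions: the get_model and calculate_reward bodies, the search spaces, the trial budget, and the reward's privacy/utility trade-off below are illustrative placeholders, and the call that launches the search is omitted because its name is not documented here (see the Colab demo for the exact entry point):

```python
import numpy as np
import torch
from DP_HyperparamTuning.experiment.train_single_model import Experiment
from DP_HyperparamTuning.algorithms.bayesian_optimization import Bayesian

def get_model():
    # Hypothetical binary classifier matching the BCELoss criterion below.
    return torch.nn.Sequential(torch.nn.Linear(10, 1), torch.nn.Sigmoid())

def calculate_reward(result):
    # Hypothetical reward trading utility off against the privacy budget;
    # the real structure of `result` follows e.run_experiment's return value.
    accuracy, epsilon = result
    return accuracy - 0.01 * epsilon

# Toy datasets standing in for real torch.utils.data.Dataset objects.
train_dataset = torch.utils.data.TensorDataset(torch.randn(64, 10), torch.rand(64, 1))
test_dataset = torch.utils.data.TensorDataset(torch.randn(16, 10), torch.rand(16, 1))

criterion = torch.nn.BCELoss()
search_space_nm = np.linspace(0.5, 4.0, 8)    # candidate noise multipliers
search_space_lr = np.linspace(1e-4, 1e-1, 8)  # candidate learning rates
num_limit = 10                                # budget of trials

e = Experiment(get_model, criterion, train_dataset, test_dataset)
b = Bayesian(e.run_experiment, calculate_reward, num_limit,
             search_space_nm=search_space_nm, search_space_lr=search_space_lr)
```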

Contributing

When contributing to this repository, please first discuss the change you wish to make via an issue, email, or any other method with the owners of this repository before making a change. We also make available a CONTRIBUTING.md and a CODE_OF_CONDUCT.md for easy communication and quick issue resolution.

Paper Citation:

```bibtex
@misc{priyanshu2021efficient,
  title={Efficient Hyperparameter Optimization for Differentially Private Deep Learning},
  author={Aman Priyanshu and Rakshit Naidu and Fatemehsadat Mireshghallah and Mohammad Malekzadeh},
  year={2021},
  eprint={2108.03888},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
```
