# DP-HyperparamTuning

DP-HyperparamTuning offers an array of tools for fast and easy tuning of the various hyperparameters of the DP-SGD algorithm.
The official repository for all algorithms and code for *Efficient Hyperparameter Optimization for Differentially Private Deep Learning*, accepted at the PPML Workshop @ ACM-CCS 2021.
A streamlined, basic implementation of all the modules presented is available below:
```python
from DP_HyperparamTuning.experiment.train_single_model import Experiment
from DP_HyperparamTuning.algorithms.bayesian_optimization import Bayesian
from DP_HyperparamTuning.algorithms.grid_search_algorithm import GridSearch
from DP_HyperparamTuning.algorithms.evolutionary_optimization import EvolutionaryOptimization
from DP_HyperparamTuning.algorithms.reinforcement_learning_optimization import RLOptimization

e = Experiment(get_model, criterion, train_dataset, test_dataset)
b = Bayesian(e.run_experiment, calculate_reward, num_limit, search_space_nm=search_space_nm, search_space_lr=search_space_lr)
```
Here, `get_model` and `calculate_reward` are user-supplied functions, `criterion` is a loss function such as `torch.nn.modules.loss.BCELoss`, and `train_dataset` and `test_dataset` are `torch.utils.data.Dataset` objects.
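For context, a minimal sketch of these user-supplied pieces might look as follows. Everything here is illustrative: the `calculate_reward` signature, the search-space format, and the toy data are assumptions for demonstration, not requirements taken from the library's documentation.

```python
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset

def get_model():
    # A fresh model per trial, so each hyperparameter setting trains from scratch.
    return nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

def calculate_reward(epsilon, loss):
    # Hypothetical reward: favor low test loss while penalizing privacy spend.
    # The argument names and the formula are illustrative assumptions.
    return 1.0 / (1.0 + loss + 0.01 * epsilon)

criterion = nn.BCELoss()

# Toy binary-classification data standing in for a real dataset.
X = torch.rand(256, 10)
y = (X.sum(dim=1) > 5).float().unsqueeze(1)
train_dataset = TensorDataset(X[:200], y[:200])
test_dataset = TensorDataset(X[200:], y[200:])

# Assumed search spaces for the noise multiplier and learning rate, plus a trial budget.
search_space_nm = [0.5, 1.0, 1.5, 2.0]
search_space_lr = [1e-3, 1e-2, 1e-1]
num_limit = 10
```

With definitions like these in place, the quick-start snippet above can construct the `Experiment` and the `Bayesian` searcher.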
When contributing to this repository, please first discuss the change you wish to make via issue, email, or any other method with the owners of this repository before making a change. We also make available a CONTRIBUTING.md and CODE_OF_CONDUCT.md for easy communication and quick issue resolution.
```bibtex
@misc{priyanshu2021efficient,
      title={Efficient Hyperparameter Optimization for Differentially Private Deep Learning},
      author={Aman Priyanshu and Rakshit Naidu and Fatemehsadat Mireshghallah and Mohammad Malekzadeh},
      year={2021},
      eprint={2108.03888},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
```