# Attack_SplitNN

Attack_SplitNN allows you to easily experiment with various combinations of attack and defense algorithms against SplitNN within PyTorch and scikit-learn.
## Install

```
pip install git+https://github.com/Koukyosyumei/Attack_SplitNN
```
## Usage

You can easily create a two-party SplitNN with this package as follows. The client holds only the input data, and the server holds only the labels. This package implements SplitNN as a custom torch.nn.Module, so you can train a SplitNN just like a normal torch model.
Example:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Client, Server, and SplitNN are provided by this package.
# FirstNet, SecondNet, train_loader, device, and torch_auc are user-defined;
# an illustrative sketch of them follows this example.

model_1 = FirstNet()
model_1 = model_1.to(device)

model_2 = SecondNet()
model_2 = model_2.to(device)

opt_1 = optim.Adam(model_1.parameters(), lr=1e-3)
opt_2 = optim.Adam(model_2.parameters(), lr=1e-3)

criterion = nn.BCELoss()

client = Client(model_1)
server = Server(model_2)

splitnn = SplitNN(client, server, opt_1, opt_2)

splitnn.train()
for epoch in range(3):
    epoch_loss = 0
    epoch_outputs = []
    epoch_labels = []
    for i, data in enumerate(train_loader):
        splitnn.zero_grads()
        inputs, labels = data
        inputs = inputs.to(device)
        labels = labels.to(device)

        outputs = splitnn(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        epoch_loss += loss.item() / len(train_loader.dataset)

        epoch_outputs.append(outputs)
        epoch_labels.append(labels)

        splitnn.backward()
        splitnn.step()

    print(epoch_loss,
          torch_auc(torch.cat(epoch_labels), torch.cat(epoch_outputs)))
```
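The example above assumes some user-supplied pieces: the two model halves (FirstNet on the client, SecondNet on the server), a DataLoader, a device, and an AUC helper. A minimal, self-contained sketch of those pieces is shown below; the architecture, toy binary-classification data, and helper are illustrative placeholders, not part of this package.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from sklearn.metrics import roc_auc_score

device = "cuda" if torch.cuda.is_available() else "cpu"

class FirstNet(nn.Module):
    """Client-side half: maps raw inputs to an intermediate representation."""
    def __init__(self, in_features=64, hidden_dim=16):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(in_features, hidden_dim), nn.ReLU())

    def forward(self, x):
        return self.layers(x)

class SecondNet(nn.Module):
    """Server-side half: maps the intermediate representation to a probability."""
    def __init__(self, hidden_dim=16):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(hidden_dim, 1), nn.Sigmoid())

    def forward(self, h):
        return self.layers(h)

def torch_auc(labels, outputs):
    """ROC-AUC over the epoch's concatenated labels and predictions."""
    return roc_auc_score(labels.detach().cpu().numpy(),
                         outputs.detach().cpu().numpy())

# Toy binary-classification data standing in for a real dataset.
X = torch.randn(512, 64)
y = (X[:, 0] > 0).float().unsqueeze(1)
train_loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)
```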
## Attack

Attack_SplitNN offers several attack methods with the same interface (a conceptual sketch of the Norm Attack follows the table below).
| Attack | Type | Example | Reference |
|---|---|---|---|
| Intermediate Level Attack | evasion attack | notebook | original paper |
| Norm Attack | label leakage attack | notebook | original paper |
| Transfer Inherit Attack | membership inference attack | notebook | original paper |
| Black Box Model Inversion Attack | model inversion attack | notebook | blog |
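As a rough illustration of the idea behind the Norm Attack: in imbalanced binary classification, the gradient the server sends back through the cut layer tends to have a larger norm for positive examples, so the gradient norm alone can leak the private labels. The sketch below is conceptual and does not use this library's API; `received_grads`, the toy data, and the scoring helper are assumptions for illustration only.

```python
# Conceptual sketch of a norm attack: score each example by the L2 norm of the
# gradient it receives at the cut layer, then measure how well that score
# separates the two classes.
import torch
from sklearn.metrics import roc_auc_score

def norm_attack_scores(received_grads: torch.Tensor) -> torch.Tensor:
    """received_grads: per-example gradients at the cut layer, shape (batch, ...)."""
    return received_grads.flatten(start_dim=1).norm(dim=1)

# Toy demonstration: positive examples get systematically larger gradients.
true_labels = torch.randint(0, 2, (256,))
received_grads = torch.randn(256, 16) * (1.0 + 2.0 * true_labels.unsqueeze(1).float())

scores = norm_attack_scores(received_grads)
print("label-leak AUC:", roc_auc_score(true_labels.numpy(), scores.numpy()))
```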
## Defense

| Defense | Example | Reference |
|---|---|---|
| Max Norm | notebook | original paper |
| NoPeek | notebook | original paper |
| Shredder | notebook | original paper |
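On the defense side, the Max Norm heuristic counters norm-based label leakage by adding noise along each example's gradient direction so that every gradient's expected squared norm matches the largest norm in the batch. The sketch below only illustrates that idea; it is not this library's Max Norm implementation, and the function name is hypothetical.

```python
# Conceptual sketch of a max-norm style protection for cut-layer gradients.
import torch

def max_norm_perturb(grads: torch.Tensor) -> torch.Tensor:
    """grads: per-example gradients at the cut layer, shape (batch, dim)."""
    norms = grads.norm(dim=1, keepdim=True)          # (batch, 1)
    max_norm = norms.max()
    # Per-example noise std so that E[||g + eps * g/||g|| ||^2] == max_norm^2.
    sigma = torch.sqrt(torch.clamp(max_norm**2 - norms**2, min=0.0))
    noise = torch.randn(grads.size(0), 1, device=grads.device) * sigma
    return grads + noise * grads / (norms + 1e-12)

protected = max_norm_perturb(torch.randn(32, 16))
```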
## License

This software is released under the MIT License; see LICENSE.txt.