Cifar-Attack

How to use

  1. Use train.sh or train_pgd.sh to train a model to be attacked; train_pgd.sh performs PGD-based adversarial training (a minimal sketch of that training step follows this list)
  2. Use attack.sh to generate adversarial examples
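
The following is a minimal PyTorch sketch of the PGD adversarial-training step that train_pgd.sh is assumed to perform; the model, data loader, and hyperparameters (eps=8/255, alpha=2/255, 10 steps) are illustrative placeholders, not necessarily this repository's settings.

```python
import torch
import torch.nn.functional as F

def pgd_perturb(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Craft PGD adversarial examples inside an L-inf ball of radius eps."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1)
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()

def train_epoch_pgd(model, loader, optimizer, device):
    """One epoch of adversarial training: fit the model on PGD examples."""
    model.train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        x_adv = pgd_perturb(model, x, y)
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x_adv), y)
        loss.backward()
        optimizer.step()
```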

Note

  • utils.py contains tools for generating stronger, more transferable adversarial examples (a hypothetical sketch of both ideas follows this list):
    • Gaussian Ambiguity
    • Logits Merge
  • examples contains some adversarial examples generated with this repository
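
"Gaussian Ambiguity" and "Logits Merge" are this repository's own terms; the sketch below shows what they are assumed to mean (input smoothing with Gaussian noise, and logit ensembling over several surrogate models), not the actual code in utils.py.

```python
import torch

def gaussian_ambiguity(model, x, sigma=0.05, n_samples=4):
    """Assumed form of 'Gaussian Ambiguity': average logits over several
    Gaussian-noised copies of the input, which smooths the attack gradient
    and tends to improve transferability."""
    noisy = [x + sigma * torch.randn_like(x) for _ in range(n_samples)]
    return torch.stack([model(z) for z in noisy], dim=0).mean(dim=0)

def logits_merge(models, x):
    """Assumed form of 'Logits Merge': average the logits of several surrogate
    models so the crafted perturbation is not overfit to any single one."""
    return torch.stack([m(x) for m in models], dim=0).mean(dim=0)
```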

Special thanks to torchattacks
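
As a usage note, attack.sh presumably drives something along these lines with the torchattacks library; the surrogate model, inputs, and the eps=8/255 budget here are illustrative placeholders.

```python
import torchattacks

def craft_adversarial(surrogate_model, images, labels):
    # PGD under an L-inf budget, a common CIFAR-10 setting (placeholder values)
    attack = torchattacks.PGD(surrogate_model, eps=8/255, alpha=2/255, steps=10)
    # torchattacks returns perturbed images clipped to [0, 1]
    return attack(images, labels)
```

The crafted images can then be fed to the black-box target model to measure how well the attack transfers.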

About

A repository that implements black-box transfer-based adversarial attacks against models trained on CIFAR-10.
