https://github.com/layumi/u_turn
IJCV22 :see_no_evil: Attack your retrieval model via Query! They are not as robust as you expected! :hear_no_evil:
- Host: GitHub
- URL: https://github.com/layumi/u_turn
- Owner: layumi
- License: mit
- Created: 2018-03-22T06:43:48.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2022-11-30T16:45:48.000Z (almost 3 years ago)
- Last Synced: 2024-12-28T04:46:42.479Z (9 months ago)
- Language: Python
- Homepage: https://arxiv.org/abs/1809.02681
- Size: 27 MB
- Stars: 46
- Watchers: 5
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# :see_no_evil: U-Turn :hear_no_evil:
Attack your retrieval model via Query! They are not as robust as you expected!
[License: MIT](https://opensource.org/licenses/MIT)
Simple code to cheat your retrieval model by **modifying the query ONLY** (based on [pytorch](https://pytorch.org)), accepted by IJCV.
The pre-print version is at https://arxiv.org/abs/1809.02681. The main idea underpinning our method is simple yet effective: make the query feature conduct a U-turn :arrow_right_hook:.
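To make the idea concrete, below is a minimal PGD-style sketch of the opposite-direction attack, assuming a generic PyTorch feature extractor `model` that maps an image batch to an embedding; the function name, the L_inf budget `epsilon`, the step size `alpha` and the number of steps are illustrative placeholders, not the exact interface of `experiment.py`.

```python
import torch
import torch.nn.functional as F

def u_turn_attack(model, query, epsilon=8 / 255, alpha=2 / 255, steps=10):
    """Perturb the query so that its feature points away from the original feature."""
    model.eval()
    with torch.no_grad():
        f_orig = F.normalize(model(query), dim=1)        # original query feature
    adv = query.clone().detach()
    for _ in range(steps):
        adv.requires_grad_(True)
        f_adv = F.normalize(model(adv), dim=1)
        # Minimising the cosine similarity drives f_adv toward -f_orig: the "U-turn".
        loss = F.cosine_similarity(f_adv, f_orig).mean()
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            adv = adv - alpha * grad.sign()                       # step down the similarity
            adv = query + (adv - query).clamp(-epsilon, epsilon)  # stay in the L_inf ball
            adv = adv.clamp(0, 1).detach()                        # keep a valid image
    return adv
```

Because only the query is modified, the gallery, the index and the model itself are left untouched.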

## Table of contents
* [Re-ID Attacking](#re-id-attacking)
* [Image Retrieval Attacking](#image-retrieval-attacking)
* [Cifar Attacking](#cifar-attacking)

## Re-ID Attacking
### 1.1 Preparing your reID models.
Please check the step-by-step tutorial at https://github.com/layumi/Person_reID_baseline_pytorch.

### 1.2 Attacking Market-1501
Try four attack methods with one line. Please change the path before running it.
```bash
python experiment.py
```
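For reference, a hypothetical end-to-end snippet that wires a single query image into the sketch from the introduction; the image path is a placeholder and an ImageNet ResNet-50 stands in for your trained re-ID baseline.

```python
import torch.nn as nn
from PIL import Image
from torchvision import models, transforms
from torchvision.utils import save_image

# Stand-in feature extractor; in practice, load your trained re-ID baseline instead.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = nn.Identity()                      # keep the pooled 2048-d feature

preprocess = transforms.Compose([
    transforms.Resize((256, 128)),               # a common person re-ID input size
    transforms.ToTensor(),
])

query = preprocess(Image.open("path/to/market1501/query/some_query.jpg")).unsqueeze(0)
adv_query = u_turn_attack(backbone, query)       # u_turn_attack: sketch from the introduction
save_image(adv_query, "adv_query.png")
```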

## Image Retrieval Attacking
### 2.1 Download the pre-trained model on Oxford and Paris
We attach the training code, which is based on the excellent code in TPAMI 2018.
https://github.com/layumi/Oxford-Paris-Attack

### 2.2 Attacking the Oxford and Paris Dataset
Our goal is to cheat the TPAMI model. Yes, we succeed.
https://github.com/layumi/Oxford-Paris-Attack
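As a quick sanity check that the retrieval model is really fooled, you can compare where the true match lands in the ranking for the clean and the adversarial query. A minimal sketch, assuming you already have L2-comparable gallery features `gallery_feats` (N x D) and a feature extractor `model`; this is not the evaluation script shipped with the repository.

```python
import torch
import torch.nn.functional as F

def rank_of_true_match(model, query, gallery_feats, true_index):
    """Position (0 = best) of the true match when ranking the gallery by cosine similarity."""
    with torch.no_grad():
        q = F.normalize(model(query), dim=1)      # 1 x D query feature
    g = F.normalize(gallery_feats, dim=1)         # N x D gallery features
    sims = (q @ g.t()).squeeze(0)                 # N cosine similarities
    order = sims.argsort(descending=True)         # most similar first
    return (order == true_index).nonzero(as_tuple=True)[0].item()

# A successful attack pushes the true match far down the list:
# rank_clean = rank_of_true_match(model, query, gallery_feats, idx)
# rank_adv   = rank_of_true_match(model, adv_query, gallery_feats, idx)
```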

### 2.3 Attacking Food-256 and CUB-200-2011
Please check the subfolders.

Food: https://github.com/layumi/U_turn/tree/master/Food
CUB: https://github.com/layumi/U_turn/tree/master/cub
## Cifar Attacking
### 3.1 Cifar (ResNet-Wide)
We attach the training code, which is borrowed from ResNet-Wide (with Random Erasing).
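If you prefer to rebuild a similar training pipeline yourself, the Random Erasing augmentation mentioned above is available in torchvision. A minimal sketch of a CIFAR-10 input pipeline; the hyper-parameters are illustrative and not necessarily those of the attached code.

```python
import torchvision.transforms as T
from torchvision.datasets import CIFAR10

train_tf = T.Compose([
    T.RandomCrop(32, padding=4),
    T.RandomHorizontalFlip(),
    T.ToTensor(),
    T.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
    T.RandomErasing(p=0.5),   # randomly blanks out a rectangle in the image tensor
])
train_set = CIFAR10(root="./data", train=True, download=True, transform=train_tf)
```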

### 3.2 Attacking Cifar
https://github.com/layumi/A_reID/tree/master/cifar
### Citation
```bibtex
@article{zheng2022query,
  title   = {U-turn: Crafting Adversarial Queries with Opposite-direction Features},
  author  = {Zheng, Zhedong and Zheng, Liang and Yang, Yi and Wu, Fei},
  journal = {IJCV},
  year    = {2022}
}
```