Code for the paper "Graph Attention Tracking" (CVPR 2021). Repository: https://github.com/ohhhyeahhh/SiamGAT
# SiamGAT

## 1. Environment setup
This code has been tested on Ubuntu 16.04 with Python 3.5, PyTorch 1.2.0, and CUDA 9.0.
Please install the required libraries before running this code:
```bash
pip install -r requirements.txt
```

## 2. Test



Results and trained models on common benchmarks (SiamGAT* denotes the variant trained with the `siamgat_ct_xx_xx` configs):

| Dataset | Metric | SiamGAT | SiamGAT* | SiamGAT Model Link | SiamGAT* Model Link |
| --- | --- | --- | --- | --- | --- |
| GOT-10k | AO | 63.1 | 67.1 | Google Drive / BaiduYun (code: zktx) | Google Drive / BaiduYun (code: d74o) |
| | SR0.5 | 74.6 | 78.7 | | |
| | SR0.75 | 50.4 | 58.9 | | |
| TrackingNet | Success | 75.3 | 76.9 | Google Drive / BaiduYun (code: n2sm) | Google Drive / BaiduYun (code: fxo2) |
| | Norm precision | 80.7 | 82.4 | | |
| | Precision | 69.8 | 71.9 | | |
| LaSOT | Success | 53.9 | 59.5 | Google Drive / BaiduYun (code: dilp) | |
| | Norm precision | 63.3 | 69.0 | | |
| | Precision | 53.0 | 61.2 | | |
| VOT2020 | EAO | - | 0.453 | - | |
| | A | - | 0.756 | | |
| | R | - | 0.729 | | |
| OTB100 | Success | 71.0 | 71.5 | Google Drive / BaiduYun (code: w1rs) | Google Drive / BaiduYun (code: c6c5) |
| | Precision | 91.7 | 93.0 | | |
| UAV123 | Success | 64.6 | - | | - |
| | Precision | 84.3 | - | | |

### Prepare testing datasets
Download the testing datasets and put them into the `test_dataset` directory. JSON annotation files for commonly used datasets can be downloaded from [BaiduYun](https://pan.baidu.com/s/1js0Qhykqqur7_lNRtle1tA#list/path=%2F). If you want to test the tracker on a new dataset, please refer to [pysot-toolkit](https://github.com/StrangerZhang/pysot-toolkit) to set up `test_dataset`.
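The exact schema of those dataset JSONs is not spelled out here; pysot-style toolkits conventionally map each video name to its frame paths and ground-truth boxes (keys such as `video_dir`, `init_rect`, `img_names`, `gt_rect` are an assumption based on that convention, not taken from this repository). A minimal sketch of writing and reading such a file:

```python
import json

# Hypothetical miniature dataset JSON in a pysot-style layout (assumed schema):
# video name -> frame list and ground-truth boxes in [x, y, w, h] format.
sample = {
    "Basketball": {
        "video_dir": "Basketball",
        "init_rect": [198, 214, 34, 81],  # first-frame box: x, y, w, h
        "img_names": ["Basketball/img/0001.jpg", "Basketball/img/0002.jpg"],
        "gt_rect": [[198, 214, 34, 81], [197, 214, 34, 81]],
    }
}

with open("OTB100_sample.json", "w") as f:
    json.dump(sample, f)

# Iterate over videos the way a test loop would.
with open("OTB100_sample.json") as f:
    dataset = json.load(f)
for name, video in dataset.items():
    print(name, len(video["img_names"]), "frames, init box", video["init_rect"])
```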

### Test the tracker
```bash
# --config: siamgat_xx_xx for SiamGAT, siamgat_ct_xx_xx for SiamGAT*
# --dataset: GOT-10k, LaSOT, TrackingNet, OTB100, or UAV123
# --snapshot: path to the downloaded model (tracker_name)
python testTracker.py \
    --config ../experiments/siamgat_googlenet_ct_alldataset/config.yaml \
    --dataset OTB100 \
    --snapshot snapshot/otb_uav_model.pth
```
The testing results will be saved in the `results/<dataset_name>/<tracker_name>` directory.
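Pysot-style toolkits usually store one text file per video in that directory, with one `x,y,w,h` box per line; that format is an assumption here, not verified against this repository. A sketch of writing and reading one such file back:

```python
import os

# Write a tiny hypothetical result file in the one-box-per-line "x,y,w,h"
# format used by pysot-style toolkits (an assumed convention).
os.makedirs("results/OTB100/otb_uav_model", exist_ok=True)
with open("results/OTB100/otb_uav_model/Basketball.txt", "w") as f:
    f.write("198,214,34,81\n197,214,34,81\n")

# Parse it back into a list of [x, y, w, h] boxes.
with open("results/OTB100/otb_uav_model/Basketball.txt") as f:
    boxes = [list(map(float, line.split(","))) for line in f]
print(len(boxes), "frames, first box:", boxes[0])
```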

## 3. Train

### Prepare training datasets

Download the datasets:
* [VID](http://image-net.org/challenges/LSVRC/2017/)
* [YOUTUBEBB](https://pan.baidu.com/s/1gQKmi7o7HCw954JriLXYvg) (code: v7s6)
* [DET](http://image-net.org/challenges/LSVRC/2017/)
* [COCO](http://cocodataset.org)
* [GOT-10K](http://got-10k.aitestunion.com/downloads)
* [LaSOT](https://cis.temple.edu/lasot/)
* [TrackingNet](https://tracking-net.org/#downloads)

**Note:** `training_dataset/dataset_name/readme.md` describes in detail how to generate the training datasets.

### Download pretrained backbones
Download the pretrained backbone from this [link](https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth) and put it into the `pretrained_models` directory.

### Train a model
To train the SiamGAT model, run `train.py` with the desired configs:

```bash
cd tools
# --cfg: siamgat_xx_xx for SiamGAT, siamgat_ct_xx_xx for SiamGAT*
python train.py \
    --cfg ../experiments/siamgat_googlenet/config.yaml
```

## 4. Evaluation

We provide tracking results for comparison:
- SiamGAT: [BaiduYun](https://pan.baidu.com/s/1HBE0Kn2ietvQT7NLExAQoA) (extract code: 8nox) or [Google Drive](https://drive.google.com/file/d/1xAbTfJNKpGJykdFrTDGtHQtxgKHOhgL7/view?usp=sharing).
- SiamGAT*: [BaiduYun](https://pan.baidu.com/s/1dWhUxsJyE37d8PfOdqFR_g) (extract code: kjym) or [Google Drive](https://drive.google.com/file/d/19nzlqz9aCswQwnnvc9AS7btAg_uLCTYI/view?usp=sharing).

If you want to evaluate the tracker on OTB100, UAV123, or LaSOT, please put the results into the `results` directory and then run `eval.py`.
Evaluate GOT-10k results on its [server](http://got-10k.aitestunion.com/) and TrackingNet results on its [server](https://tracking-net.org/).

```bash
# --tracker_path: directory containing the raw results
# --dataset: dataset_name
# --tracker_prefix: tracker_name
python eval.py \
    --tracker_path ./results \
    --dataset OTB100 \
    --tracker_prefix 'otb_uav_model'
```
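The success numbers in the table above are based on the intersection-over-union (IoU) between predicted and ground-truth boxes, and the SR0.5/SR0.75 columns count the fraction of frames whose IoU exceeds a threshold. A self-contained sketch of these standard definitions (not this repository's `eval.py`):

```python
def iou(a, b):
    """IoU of two boxes given as [x, y, w, h]."""
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))  # intersection width
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))  # intersection height
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def success_rate(preds, gts, threshold=0.5):
    """Fraction of frames whose IoU exceeds `threshold` (cf. the SR0.5 column)."""
    overlaps = [iou(p, g) for p, g in zip(preds, gts)]
    return sum(o > threshold for o in overlaps) / len(overlaps)

# Three synthetic frames: exact hit, small drift, complete miss.
preds = [[198, 214, 34, 81], [190, 210, 34, 81], [120, 120, 34, 81]]
gts   = [[198, 214, 34, 81], [197, 214, 34, 81], [197, 214, 34, 81]]
print(success_rate(preds, gts, 0.5))  # prints 0.6666666666666666
```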

## 5. Acknowledgement
The code is implemented based on [pysot](https://github.com/STVIR/pysot) and [SiamCAR](https://github.com/ohhhyeahhh/SiamCAR). We would like to express our sincere thanks to the contributors.

## 6. Cite
If you use SiamGAT in your work, please cite our papers:

```bibtex
@article{cui2022joint,
  title={Joint Classification and Regression for Visual Tracking with Fully Convolutional Siamese Networks},
  author={Cui, Ying and Guo, Dongyan and Shao, Yanyan and Wang, Zhenhua and Shen, Chunhua and Zhang, Liyan and Chen, Shengyong},
  journal={International Journal of Computer Vision},
  year={2022},
  publisher={Springer},
  doi={10.1007/s11263-021-01559-4}
}

@InProceedings{Guo_2021_CVPR,
  author={Guo, Dongyan and Shao, Yanyan and Cui, Ying and Wang, Zhenhua and Zhang, Liyan and Shen, Chunhua},
  title={Graph Attention Tracking},
  booktitle={The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month={June},
  year={2021}
}

@InProceedings{Guo_2020_CVPR,
  author={Guo, Dongyan and Wang, Jun and Cui, Ying and Wang, Zhenhua and Chen, Shengyong},
  title={SiamCAR: Siamese Fully Convolutional Classification and Regression for Visual Tracking},
  booktitle={The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month={June},
  year={2020}
}
```