Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.

TransT
Transformer Tracking (CVPR2021)
https://github.com/chenxin-dlut/TransT
- Host: GitHub
- URL: https://github.com/chenxin-dlut/TransT
- Owner: chenxin-dlut
- License: gpl-3.0
- Created: 2021-03-17T04:10:24.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2023-07-01T08:29:45.000Z (over 1 year ago)
- Last Synced: 2024-08-02T06:13:01.991Z (3 months ago)
- Language: Python
- Homepage:
- Size: 1.78 MB
- Stars: 563
- Watchers: 9
- Forks: 104
- Open Issues: 54
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-Visual-Object-Tracking
README
# TransT - Transformer Tracking [CVPR2021]
Official implementation of TransT (CVPR 2021), including training code and trained models.

## News
- :trophy: **TransT-M wins the VOT2021 Real-Time Challenge with an EAOMultistart of 0.550! The code will be released soon.**

## Tracker

#### TransT ####
[**[Paper]**](https://arxiv.org/abs/2103.15436)
[**[Models(google)]**](https://drive.google.com/drive/folders/1GVQV1GoW-ttDJRRqaVAtLUtubtgLhWCE?usp=sharing)
[**[Models(baidu:iiau)]**](https://pan.baidu.com/s/1geI1cIv_AdLUd7qYKWIqzw)
[**[Raw Results]**](https://drive.google.com/file/d/1FSUh6NSzu8H2HzectIwCbDEKZo8ZKUro/view?usp=sharing)

This work presents an attention-based feature fusion network, which effectively combines the template and search region features using attention. Specifically, the proposed method includes an ego-context augment module based on self-attention and a cross-feature augment module based on cross-attention. We present a Transformer tracking method (named TransT) built on a Siamese-like feature extraction backbone, the designed attention-based fusion mechanism, and a classification and regression head. TransT is a very simple and efficient tracker: it has no online update module and uses the same model and hyperparameters for all test sets.
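To make the two attention modules concrete, here is a minimal PyTorch-style sketch of an ego-context augment (self-attention) step and a cross-feature augment (cross-attention) step. The class names, dimensions, and single-layer structure are illustrative only and do not reproduce the repository's actual implementation.

```python
import torch
import torch.nn as nn

class EgoContextAugment(nn.Module):
    """Illustrative ECA: self-attention over one feature map (not the repo's code)."""
    def __init__(self, dim=256, heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                 # x: (H*W, B, C) flattened feature map
        out, _ = self.attn(x, x, x)       # query, key, value all come from x
        return self.norm(x + out)         # residual connection + layer norm

class CrossFeatureAugment(nn.Module):
    """Illustrative CFA: cross-attention from one branch to the other."""
    def __init__(self, dim=256, heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads)
        self.norm = nn.LayerNorm(dim)

    def forward(self, q_feat, kv_feat):                 # query from one branch,
        out, _ = self.attn(q_feat, kv_feat, kv_feat)    # key/value from the other
        return self.norm(q_feat + out)

# Toy usage: fuse template (z) and search-region (x) features of dimension 256.
z = torch.randn(64, 1, 256)    # 8x8 template feature map, batch size 1
x = torch.randn(256, 1, 256)   # 16x16 search-region feature map, batch size 1
eca, cfa = EgoContextAugment(), CrossFeatureAugment()
x_fused = cfa(eca(x), eca(z))  # search features enhanced with template context
```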
![TransT overview figure](pytracking/.figs/Framework.png)
![ECA and CFA](pytracking/.figs/ECACFA.png)

## Results
For VOT2020, we add a mask branch to generate masks, without any hyperparameter tuning. The code of the mask branch will be released soon.
| Model | LaSOT AUC (%) | TrackingNet AUC (%) | GOT-10k AO (%) | VOT2020 EAO (%) | TNL2K AUC (%) | OTB100 AUC (%) | NFS AUC (%) | UAV123 AUC (%) | Speed | Params |
|---|---|---|---|---|---|---|---|---|---|---|
| TransT-N2 | 64.2 | 80.9 | 69.9 | - | - | 68.1 | 65.7 | 67.0 | 70 fps | 16.7M |
| TransT-N4 | 64.9 | 81.4 | 72.3 | 49.5 | 51.0 | 69.4 | 65.7 | 69.1 | 50 fps | 23.0M |
## Installation
This document contains detailed instructions for installing the necessary dependencies for **TransT**. The instructions have been tested on an Ubuntu 18.04 system.

#### Install dependencies
* Create and activate a conda environment
```bash
conda create -n transt python=3.7
conda activate transt
```
* Install PyTorch
```bash
conda install -c pytorch pytorch=1.5 torchvision=0.6.1 cudatoolkit=10.2
```
* Install other packages
```bash
conda install matplotlib pandas tqdm
pip install opencv-python tb-nightly visdom scikit-image tikzplotlib gdown
conda install cython scipy
sudo apt-get install libturbojpeg
pip install pycocotools jpeg4py
pip install wget yacs
pip install shapely==1.6.4.post2
```
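As an optional sanity check (not part of the original instructions), you can verify from Python that the key packages and the CUDA toolkit are visible before going further:

```python
# Optional environment check (not from the original README).
import torch, torchvision, cv2, pycocotools  # core dependencies installed above
print("PyTorch:", torch.__version__)          # expected: 1.5.x per the install command
print("torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())
print("OpenCV:", cv2.__version__)
```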
* Set up the environment
Create the default environment setting files.
```bash
# Change directory to TransT
cd TransT
# Environment settings for pytracking. Saved at pytracking/evaluation/local.py
python -c "from pytracking.evaluation.environment import create_default_local_file; create_default_local_file()"
# Environment settings for ltr. Saved at ltr/admin/local.py
python -c "from ltr.admin.environment import create_default_local_file; create_default_local_file()"
```
You can modify these files to set the paths to datasets, result directories, etc.; a sketch of a filled-in file is shown below.
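The generated setting files are plain Python modules that you edit directly. Here is a sketch of what the relevant part of `pytracking/evaluation/local.py` might look like after editing; only `network_path` is named in this README, while the other attribute names and the `EnvSettings`/`local_env_settings` structure follow the PyTracking convention and may differ slightly in your generated file.

```python
# Illustrative excerpt of pytracking/evaluation/local.py after editing.
# Check the file generated by create_default_local_file() for the exact
# attribute names; the ones below (other than network_path) are assumptions.
from pytracking.evaluation.environment import EnvSettings

def local_env_settings():
    settings = EnvSettings()
    settings.network_path = '/path/to/TransT/pytracking/networks/'   # trained models
    settings.results_path = '/path/to/TransT/pytracking/tracking_results/'
    settings.lasot_path = '/path/to/datasets/LaSOT/'
    settings.got10k_path = '/path/to/datasets/GOT-10k/'
    return settings
```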
* Add the project path to the environment variables
Open ~/.bashrc and add the following line at the end. Remember to change the path to your real project path.
```
export PYTHONPATH=:$PYTHONPATH
```
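After reloading the shell (e.g. `source ~/.bashrc`), an optional way to confirm the path is picked up is to import the repository's two top-level packages from any directory. This check is a suggestion, not part of the original instructions.

```python
# Optional sanity check (not from the original README): both top-level
# packages of this repository should import from any working directory.
import ltr
import pytracking
print("ltr found at:", list(ltr.__path__)[0])
print("pytracking found at:", list(pytracking.__path__)[0])
```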
* Download the pre-trained networks
Download the network for [TransT](https://drive.google.com/drive/folders/1GVQV1GoW-ttDJRRqaVAtLUtubtgLhWCE?usp=sharing)
and put it in the directory set by "network_path" in "pytracking/evaluation/local.py". By default, it is set to
pytracking/networks.

## Quick Start
#### Training
* Modify [local.py](ltr/admin/local.py) to set the paths to datasets, results paths etc.
* Run the following commands to train TransT. You can customize some parameters by modifying [transt.py](ltr/train_settings/transt/transt.py).
```bash
conda activate transt
cd TransT/ltr
python run_training.py transt transt
```

#### Evaluation
* We integrated [PySOT](https://github.com/STVIR/pysot) for evaluation. You can download the json files from [PySOT](https://github.com/STVIR/pysot) or [here](https://drive.google.com/file/d/1PItNIOkui0iGCRglgsZPZF1-hkmj7vyv/view?usp=sharing).
You need to specify the paths of the model and the dataset in [test.py](pysot_toolkit/test.py).
```python
net_path = '/path_to_model'          # Absolute path of the model
dataset_root = '/path_to_datasets'   # Absolute path of the datasets
```
Then run the following commands.
```bash
conda activate TransT
cd TransT
python -u pysot_toolkit/test.py --dataset --name 'transt' # test tracker
python pysot_toolkit/eval.py --tracker_path results/ --dataset --num 1 --tracker_prefix 'transt' # eval tracker
```
The testing results will be saved in the current directory (results/dataset/transt/).
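If you want to inspect the saved results programmatically, the sketch below assumes the common OTB-style output of the pysot toolkit, one comma- or tab-separated `x,y,w,h` box per line per frame; VOT-style result files differ, so treat the format and the example path as assumptions.

```python
# Minimal sketch for loading one saved result file (assumes the OTB-style
# format of one "x,y,w,h" bounding box per line; adapt for VOT-style files).
import numpy as np

def load_tracking_result(path):
    boxes = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            boxes.append([float(v) for v in line.replace('\t', ',').split(',')])
    return np.array(boxes)  # shape: (num_frames, 4)

# Hypothetical usage (the path depends on your dataset and sequence names):
# boxes = load_tracking_result('results/OTB100/transt/Basketball.txt')
# print(boxes.shape)
```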
* You can also use [pytracking](pytracking) to test and evaluate tracker.
The results might be slightly different from [PySOT](https://github.com/STVIR/pysot) due to a slight difference in implementation (pytracking saves results as integers, the pysot toolkit saves results as decimals).

#### Getting Help
If you run into a problem, please try searching our GitHub issues first; if you can't find a solution, feel free to open a new issue.
* `ImportError: cannot import name region`

  Solution: You can simply delete `from pysot_toolkit.toolkit.utils.region import vot_overlap, vot_float2str` in [test.py](pysot_toolkit/test.py) if you don't test on VOT2019/18/16.
  You can also build `region` by running `python setup.py build_ext --inplace` in [pysot_toolkit](pysot_toolkit).

## Citation
```
@inproceedings{TransT,
title={Transformer Tracking},
author={Chen, Xin and Yan, Bin and Zhu, Jiawen and Wang, Dong and Yang, Xiaoyun and Lu, Huchuan},
booktitle={CVPR},
year={2021}
}
```

## Acknowledgement
This is a modified version of the Python framework [PyTracking](https://github.com/visionml/pytracking) based on **PyTorch**,
also borrowing from [PySOT](https://github.com/STVIR/pysot) and [detr](https://github.com/facebookresearch/detr).
We would like to thank their authors for providing great frameworks and toolkits.

## Contact
* Xin Chen (email: [email protected])

Feel free to contact me if you have additional questions.