Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/TuSimple/rl-multishot-reid
Multi-shot Pedestrian Re-identification via Sequential Decision Making (CVPR2018)
- Host: GitHub
- URL: https://github.com/TuSimple/rl-multishot-reid
- Owner: TuSimple
- Created: 2017-12-21T02:57:36.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2017-12-21T02:58:35.000Z (about 7 years ago)
- Last Synced: 2024-08-01T22:41:04.515Z (5 months ago)
- Language: Python
- Homepage:
- Size: 51.8 KB
- Stars: 93
- Watchers: 16
- Forks: 26
- Open Issues: 5
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- Awesome-MXNet - rl-multishot-reid
README
* [Multi-shot Re-identification](#1)
* [Preparations](#1.1)
* [Usage](#1.2)

Multi-shot Re-identification Based on Reinforcement Learning
---
Training and testing code for multi-shot re-identification. The code is currently tested on the PRID-2011, iLIDS-VID, and MARS datasets. For algorithm details and experimental results, please refer to our paper: [Multi-shot Pedestrian Re-identification via Sequential Decision Making](https://arxiv.org/abs/1712.07257)
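Conceptually, the sequential decision making in the paper works like this: the agent compares two tracklets one frame pair at a time and, at each step, outputs "same", "different", or "unsure"; "unsure" requests another frame pair at a small cost (the unsure-penalty passed to the training scripts), trading accuracy against the number of frames consumed. The sketch below is a hypothetical illustration of that loop only; none of these function or parameter names come from this repo, and the fixed thresholds stand in for the learned policy.

```python
# Hypothetical sketch of the sequential same/different/unsure decision
# loop described in the paper. The thresholds stand in for a learned
# policy; in the actual method the action is produced by a network
# trained with reinforcement learning.

def decide(similarity_scores, hi=0.7, lo=0.3, unsure_penalty=0.1):
    """Walk through per-frame-pair similarity scores until confident.

    similarity_scores: non-empty list of floats in [0, 1], one per frame
    pair (a stand-in for the verification network's output).
    Returns (decision, frames_used, total_penalty).
    """
    penalty = 0.0
    for t, s in enumerate(similarity_scores, start=1):
        if s >= hi:
            return "same", t, penalty
        if s <= lo:
            return "different", t, penalty
        penalty += unsure_penalty  # pay for requesting one more frame pair
    # Ran out of frames: fall back to thresholding the last score.
    return ("same" if s >= 0.5 else "different"), t, penalty

# Ambiguous early pairs cost two unsure-penalties before a confident "same".
decision, used, cost = decide([0.5, 0.55, 0.9])
```

A larger unsure-penalty pushes the agent toward earlier decisions with fewer frames; a smaller one lets it keep asking for evidence.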
Preparations
---
Before running this code, make the following preparations:
* Download the [MARS](http://www.liangzheng.com.cn/Project/project_mars.html), [iLIDS-VID](http://www.eecs.qmul.ac.uk/~xiatian/downloads_qmul_iLIDS-VID_ReID_dataset.html) and [PRID-2011](https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/) datasets.
* Install MXNet following the [instructions](http://mxnet.io/get_started/index.html#setup-and-installation), including the Python interface. The repo is currently tested on commit e06c55.

Usage
---
* Download the datasets and unzip them.
* Prepare the data files. Generate the image list files using `preprocess_ilds_image.py`, `preprocess_prid_image.py`, and `preprocess_mars_image.py` in the `baseline` folder.
* The code is split into two stages. The first stage is an image-based re-id task; please refer to the script `run.sh` in the `baseline` folder. The code for this stage is based on [this repo](https://github.com/TuSimple/re-identification). The usage is:
```shell
sh run.sh $gpu $dataset $network $recfolder
```
e.g. To train on the MARS dataset on GPU 0 using Inception-BN, run:
```shell
sh run.sh 0 MARS inception-bn /data3/matt/MARS/recs
```
* The second stage is a multi-shot re-id task based on reinforcement learning. Please refer to the script `run.sh` in the `RL` folder. The usage is:
```shell
sh run.sh $gpu $unsure-penalty $dataset $network $recfolder
```
* For evaluation, use `baseline/baseline_test.py` and `RL/find_eg.py`. `RL/find_eg.py` also shows some high-quality example episodes generated by our algorithm.
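As an illustration of the image-list step above: MXNet's `im2rec` tooling conventionally consumes `.lst` files with tab-separated index, label, and relative path before packing images into `.rec` files. The sketch below assumes that layout and one sub-folder per identity; the actual format used here is defined by the `preprocess_*_image.py` scripts in `baseline`, so treat this only as a hypothetical shape, not the repo's exact output.

```python
import os

# Hypothetical sketch of building an MXNet-style .lst image list
# (tab-separated: index, label, relative path), assuming the dataset
# root contains one sub-folder per identity. The real list format is
# defined by the preprocess_*_image.py scripts in baseline/.

def make_image_list(root, extensions=(".jpg", ".png")):
    lines = []
    idx = 0
    # Map each identity folder name to an integer label, in sorted order.
    for label, person in enumerate(sorted(os.listdir(root))):
        person_dir = os.path.join(root, person)
        if not os.path.isdir(person_dir):
            continue
        for fname in sorted(os.listdir(person_dir)):
            if fname.lower().endswith(extensions):
                rel = os.path.join(person, fname)
                lines.append("%d\t%d\t%s" % (idx, label, rel))
                idx += 1
    return lines
```

Each line can then be written to a `.lst` file and handed to `im2rec` to build the record folder that the `run.sh` scripts take as their last argument.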