Code for the paper "Pose2Seg: Detection Free Human Instance Segmentation" @ CVPR2019.
- Host: GitHub
- URL: https://github.com/liruilong940607/pose2seg
- Owner: liruilong940607
- License: MIT
- Created: 2019-03-25T11:41:29.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2023-09-23T16:02:50.000Z (about 1 year ago)
- Last Synced: 2024-12-14T04:07:02.813Z (8 days ago)
- Topics: cvpr2019, human, pose-estimation, segmentation
- Language: Python
- Homepage: http://www.liruilong.cn/projects/pose2seg/index.html
- Size: 1.92 MB
- Stars: 529
- Watchers: 20
- Forks: 136
- Open Issues: 25
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Pose2Seg
*Official* code for the paper "Pose2Seg: Detection Free Human Instance Segmentation" [[ProjectPage]](http://www.liruilong.cn/projects/pose2seg/index.html) [[arXiv]](https://arxiv.org/abs/1803.10683) @ CVPR2019.
The *OCHuman dataset* proposed in our paper is released [here](https://github.com/liruilong940607/OCHumanApi).
*Figure: pipeline of our pose-based instance segmentation framework.*
## Setup environment
``` bash
pip install cython matplotlib tqdm opencv-python scipy pyyaml numpy
pip install torchvision torch

cd ~/github-public/cocoapi/PythonAPI/
python setup.py build_ext install
cd -
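
# Optional sanity check (not in the original instructions): verify that
# torch and the freshly built pycocotools import cleanly.
python -c "import torch, torchvision, pycocotools; print(torch.__version__)"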
```

## Download data
- COCO 2017
- [COCO 2017 Train images [118K/18GB]](http://images.cocodataset.org/zips/train2017.zip)
- [COCO 2017 Val images [5K/1GB]](http://images.cocodataset.org/zips/val2017.zip)
- [COCOPersons Train Annotation (person_keypoints_train2017_pose2seg.json) [166MB]](https://github.com/liruilong940607/Pose2Seg/releases/download/data/person_keypoints_train2017_pose2seg.json)
- [COCOPersons Val Annotation (person_keypoints_val2017_pose2seg.json) [7MB]](https://github.com/liruilong940607/Pose2Seg/releases/download/data/person_keypoints_val2017_pose2seg.json)
- OCHuman
- [images [667MB] & annotations](https://cg.cs.tsinghua.edu.cn/dataset/form.html?dataset=ochuman)
**Note**: `person_keypoints_(train/val)2017_pose2seg.json` is a subset of `person_keypoints_(train/val)2017.json` (in [COCO2017 Train/Val annotations](http://images.cocodataset.org/annotations/annotations_trainval2017.zip)). We keep only the instances that have both keypoint and segmentation annotations for our experiments.
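
For reference, a rough sketch of that filtering with `pycocotools` could look like the following; the file path is an assumption, and the repo already ships the prebuilt JSONs above, so this is illustrative only.

```python
# Illustrative sketch (not the repo's exact script): keep person instances
# that carry both keypoint and segmentation annotations.
from pycocotools.coco import COCO

coco = COCO("annotations/person_keypoints_val2017.json")  # assumed path
ann_ids = coco.getAnnIds(catIds=coco.getCatIds(catNms=["person"]))
anns = coco.loadAnns(ann_ids)

kept = [a for a in anns
        if a.get("num_keypoints", 0) > 0 and a.get("segmentation")]
print(f"kept {len(kept)} of {len(anns)} person instances")
```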
## Setup data

The `data` folder should be organized like this:

```
data
├── coco2017
│   ├── annotations
│   │   ├── person_keypoints_train2017_pose2seg.json
│   │   ├── person_keypoints_val2017_pose2seg.json
│   ├── train2017
│   │   ├── ####.jpg
│   ├── val2017
│   │   ├── ####.jpg
├── OCHuman
│   ├── annotations
│   │   ├── ochuman_coco_format_test_range_0.00_1.00.json
│   │   ├── ochuman_coco_format_val_range_0.00_1.00.json
│   ├── images
│   │   ├── ####.jpg
```
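
Before launching anything, a quick layout check like this (my own convenience snippet, not part of the repo) can catch misplaced files early:

```python
# Sanity-check the expected data layout described above.
import os

expected = [
    "data/coco2017/annotations/person_keypoints_train2017_pose2seg.json",
    "data/coco2017/annotations/person_keypoints_val2017_pose2seg.json",
    "data/coco2017/train2017",
    "data/coco2017/val2017",
    "data/OCHuman/annotations/ochuman_coco_format_test_range_0.00_1.00.json",
    "data/OCHuman/annotations/ochuman_coco_format_val_range_0.00_1.00.json",
    "data/OCHuman/images",
]
missing = [p for p in expected if not os.path.exists(p)]
print("all paths present" if not missing else f"missing: {missing}")
```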
## How to train

``` bash
python train.py
```

**Note**: Currently we only support single-GPU training.
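
If your machine has several GPUs, one way to pick which device the single-GPU run uses is to restrict visibility before launching; `CUDA_VISIBLE_DEVICES` is a standard CUDA environment variable, not a Pose2Seg-specific option. A minimal launcher sketch:

```python
# Hypothetical launcher: expose only GPU 0 to the training process.
import os
import subprocess

env = dict(os.environ, CUDA_VISIBLE_DEVICES="0")
subprocess.run(["python", "train.py"], env=env, check=True)
```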
## How to test
This lets you evaluate the model on (1) the COCOPersons val set and (2) the OCHuman val & test sets.
``` bash
python test.py --weights last.pkl --coco --OCHuman
```

We retrained our model using this repo and obtained results similar to those in our paper. The final weights can be downloaded [here](https://drive.google.com/file/d/193i8b40MJFxawcJoNLq1sG0vhAeLoVJG/view?usp=sharing).
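
To peek at a downloaded checkpoint before running `test.py`, something like the following works under the assumption that `last.pkl` is a torch-serialized file (suggested by its use with `--weights`; the exact layout is defined by this repo):

```python
# Assumption: last.pkl is a torch checkpoint; treat this as exploratory.
import torch

ckpt = torch.load("last.pkl", map_location="cpu")
print(type(ckpt))
if isinstance(ckpt, dict):
    print(list(ckpt)[:10])  # peek at the first few keys
```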
## About Human Pose Templates in COCO
*Figure: pose templates clustered using K-means on COCO.*
This repo already contains the template file `modeling/templates.json` used in our paper, but you are free to explore different cluster parameters, as discussed in the paper. See [visualize_cluster.ipynb](visualize_cluster.ipynb) for an example.
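
To give a feel for the clustering step, here is a minimal sketch of the general idea (normalize each skeleton, then run K-means); the normalization details and the placeholder data are my assumptions, not the paper's exact recipe:

```python
# Minimal K-means sketch over COCO-style 17-keypoint skeletons.
# `poses` is placeholder data; in practice you would load real keypoints.
import numpy as np
from sklearn.cluster import KMeans

poses = np.random.rand(100, 17, 2)  # (N, 17, 2) keypoint coordinates

# Center each skeleton and scale it to unit size before clustering.
centered = poses - poses.mean(axis=1, keepdims=True)
flat = centered.reshape(len(poses), -1)
normalized = flat / np.linalg.norm(flat, axis=1, keepdims=True)

templates = KMeans(n_clusters=3, n_init=10).fit(normalized).cluster_centers_
print(templates.shape)  # (3, 34): three templates of flattened keypoints
```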