Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://zhifanzhu.github.io/getagrip/
- Host: GitHub
- URL: https://zhifanzhu.github.io/getagrip/
- Owner: zhifanzhu
- Created: 2023-12-22T17:18:37.000Z (about 1 year ago)
- Default Branch: dev
- Last Pushed: 2024-06-05T11:44:09.000Z (7 months ago)
- Last Synced: 2024-08-03T23:13:41.432Z (5 months ago)
- Language: Python
- Size: 93 MB
- Stars: 19
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- arctic - Get a Grip: Reconstructing Hand-Object Stable Grasps in Egocentric Videos
README
# Get a Grip: Reconstructing Hand-Object Stable Grasps in Egocentric Videos
[[Project Page]](https://zhifanzhu.github.io/getagrip) [[arxiv]](https://arxiv.org/abs/2312.15719)
## The EPIC-Grasps annotation
The essential annotation of the EPIC-Grasps dataset is the (start, end) information of each labelled grasp.
This is the file [code_epichor/image_sets/epichor_round3_2447valid_nonempty.csv](https://github.com/zhifanzhu/getagrip/blob/dev/code_epichor/image_sets/epichor_round3_2447valid_nonempty.csv).
The important fields in the file are:
- `vid`: the video ID
- `st`: the start frame of the grasp
- `et`: the end frame of the grasp
- `cat`: the object category
- `handside`: either "left hand" or "right hand"

Additionally:
- `fmt`: the prefix of the output files, e.g. 'bottle/P01_14_left_hand_57890_57947_*'
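For quick inspection, the annotation file can be loaded with pandas. The sketch below is illustrative only (not part of this repo) and assumes the column names listed above:
```
# Illustrative only: browse the EPIC-Grasps annotation with pandas,
# using the column names listed above.
import pandas as pd

ann = pd.read_csv(
    "code_epichor/image_sets/epichor_round3_2447valid_nonempty.csv")

# e.g. all left-hand grasps on the "bottle" category
bottles = ann[(ann["cat"] == "bottle") & (ann["handside"] == "left hand")]
for _, row in bottles.iterrows():
    print(row["vid"], row["st"], row["et"], row["fmt"])
```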
### Stable grasps on ARCTIC and HOI4D

The automatically extracted stable grasps on ARCTIC and HOI4D are in the files `code_arctic/image_sets/stable_grasps_v3_frag_valid_min20.csv` and `code_hoi4d/image_sets/stable_grasps_0.3_0.01.csv`, respectively.

## Installation
```
conda create --name getagrip-env python=3.8
conda activate getagrip-env
pip install -r requirements.txt
sh scripts_sh/install_third_party.sh
```

This repo has been tested on:
- Ubuntu 22.04, GTX 1080Ti, CUDA 12.2, python 3.8.13, torch 1.8.1+cu102
- Ubuntu 22.04, RTX 4090, CUDA 12.2, python 3.10.13, torch 2.0.0+cu118

### Setup MANO
Source: https://github.com/JudyYe/ihoi/blob/main/docs/install.md
- Download the MANO models (neutral models: MANO_LEFT.pkl, MANO_RIGHT.pkl):
  - Download `Models & Code` from the original [MANO website](https://mano.is.tue.mpg.de/). You need to register to download the MANO data.
  - Put the `models/MANO_LEFT.pkl` and `models/MANO_RIGHT.pkl` files in `./externals/mano/`.
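As a quick sanity check (a sketch, not part of this repo), you can verify that the models ended up in the expected place:
```
# Illustrative sanity check that the MANO models are where this repo expects them.
from pathlib import Path

mano_dir = Path("externals/mano")
for name in ("MANO_LEFT.pkl", "MANO_RIGHT.pkl"):
    path = mano_dir / name
    print(("OK     " if path.is_file() else "MISSING") + " " + str(path))
```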
## Download data

Link: [Google Drive](https://drive.google.com/file/d/16hPs40gsxFI7fPURHBPw5COApumoLvke/view?usp=sharing)
Place the `epicgrasps_storage/` directory in the root of the repository, i.e. the same level as this README.
The archive includes EPIC-Kitchens images and VISOR masks, in case you don't already have them.
To save space, only the method inputs, i.e. the 30 sampled frames per sequence, are included.
This is the minimal data needed to reproduce the results in the paper.

## Run
To run one sequence, e.g. `fmt=bottle/P01_14_left_hand_57890_57947` (line 189 in the annotation CSV), run:
```
python temporal/run_fit_mvho.py \
--config-dir=config/epichor \
--config-name=mvho_hamer_xxxx \
hydra.run.dir=outputs/demo_out \
homan.version=lowdim \
optim_mv.num_inits_parallel=5 \
debug_locate=P01_14_left_hand_57890_57947
```
See `example_outputs/demo_out/` for the outputs of the above command.

To run all sequences, run:
```
python temporal/run_fit_mvho.py \
--config-dir=config/epichor \
--config-name=mvho_hamer_xxxx \
hydra.run.dir=outputs/demo_out \
homan.version=lowdim \
optim_mv.num_inits_parallel=5
```
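Alternatively, the single-sequence command can be driven from the annotation CSV. The sketch below is illustrative only (not part of this repo); it assumes that `debug_locate` is the `fmt` entry with the leading category and trailing `_*` stripped (consistent with the example above), and it gives each run its own output directory:
```
# Illustrative driver: launch the single-sequence command for a few rows
# of the annotation CSV via subprocess. Assumes `debug_locate` is the `fmt`
# entry without the category prefix and without the trailing '_*'.
import subprocess
import pandas as pd

ann = pd.read_csv(
    "code_epichor/image_sets/epichor_round3_2447valid_nonempty.csv")

for fmt in ann["fmt"].head(3):          # first 3 sequences as a demo
    locate = fmt.split("/", 1)[1].rstrip("*_")
    subprocess.run([
        "python", "temporal/run_fit_mvho.py",
        "--config-dir=config/epichor",
        "--config-name=mvho_hamer_xxxx",
        f"hydra.run.dir=outputs/{locate}",   # one output dir per sequence
        "homan.version=lowdim",
        "optim_mv.num_inits_parallel=5",
        f"debug_locate={locate}",
    ], check=True)
```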
## Inspecting the results

The quantitative results for each sequence are saved into a `*_metrics.csv` file,
e.g. P01_14_left_hand_57890_57947_metrics.csv.
Each CSV contains `num_init_poses` rows.
The fields `oious` and `avg_sca` correspond to the **IOU** and **SCA** columns in Table 3 of the paper.
These two fields serve as an indicator of the quality of the result.
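As an illustration (not part of this repo), the per-sequence metrics can be aggregated with pandas, assuming the `*_metrics.csv` files sit under the output directory used above:
```
# Illustrative aggregation of per-sequence metrics into mean IOU / SCA.
# Assumes the *_metrics.csv files live somewhere under outputs/.
from pathlib import Path
import pandas as pd

frames = [pd.read_csv(p) for p in Path("outputs").rglob("*_metrics.csv")]
if frames:
    metrics = pd.concat(frames, ignore_index=True)
    print("mean IOU:", metrics["oious"].mean())
    print("mean SCA:", metrics["avg_sca"].mean())
else:
    print("No *_metrics.csv files found under outputs/")
```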
## Notes

- FrankMocap-related code will not work, as FrankMocap is not installed.
- The provided input data contains HaMeR results, but the procedure for running HaMeR is not included in this repo (yet). Please refer to [HaMeR](https://github.com/geopavlakos/hamer) if you use this code.

## Acknowledgements
Much of this repository is based on [HOMan](https://github.com/hassony2/homan) and [IHOI](https://github.com/JudyYe/ihoi).