# Awesome-Human-Video-Generation
[![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
[![Made With Love](https://img.shields.io/badge/Made%20With-Love-red.svg)](https://github.com/chetanraj/awesome-github-badges)

#### This repository accompanies our survey paper: ["A Comprehensive Survey on Human Video Generation: Challenges, Methods, and Insights"](https://arxiv.org/abs/2407.08428).

#### It is a continuously updated project to track the latest progress in human video generation.

#### We would be delighted if this repository brings you some insights.
#### If you like our project, please give us a star ⭐ on GitHub.
#### If you have any suggestions or comments about our project or survey, please feel free to contact us at [[email protected]]([email protected]).

## :collision: Highlights
- 2024.07.12: The survey paper is now available on [arXiv](https://arxiv.org/abs/2407.08428).
- 2024.06.05: The Awesome Human Video Generation repository was started.

## Human Body Video Demo

Here are some demo results from recent repositories and papers.

## Citation

If you find our work useful in your research, please consider citing:
```bibtex
@article{lei2024humanvideo,
  title={A Comprehensive Survey on Human Video Generation: Challenges, Methods, and Insights},
  author={Wentao Lei and Jinting Wang and Fengji Ma and Guanjie Huang and Li Liu},
  journal={arXiv preprint arXiv:2407.08428},
  year={2024}
}
```

## Table of Contents
- [Text Guided Human Video Generation](#text-guided-human-video-generation)
- [Audio Guided Human Video Generation](#audio-guided-human-video-generation)
  - [Dance Video Generation](#dance-video-generation)
  - [Performance Video Generation](#performance-video-generation)
  - [Co-Speech Gesture Video Generation](#co-speech-gesture-video-generation)
- [Pose Guided Human Video Generation](#pose-guided-human-video-generation)
- [Applications](#applications)
- [Datasets](#datasets)

## Text Guided Human Video Generation
+ [HMTV (WACV 2024)](https://github.com/CSJasper/HMTV)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://openaccess.thecvf.com/content/WACV2024/papers/Kim_Human_Motion_Aware_Text-to-Video_Generation_With_Explicit_Camera_Control_WACV_2024_paper.pdf)
[![Star](https://img.shields.io/github/stars/CSJasper/HMTV.svg?style=social&label=Star)](https://github.com/CSJasper/HMTV)

+ [SignLLM](https://signllm.github.io/)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2405.10718v1)
[![Website](https://img.shields.io/badge/Website-9cf)](https://signllm.github.io/)

+ [Text2Performer (ICCV 2023)](https://github.com/yumingj/Text2Performer)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2304.08483)
[![Star](https://img.shields.io/github/stars/yumingj/Text2Performer.svg?style=social&label=Star)](https://github.com/yumingj/Text2Performer)
[![Website](https://img.shields.io/badge/Website-9cf)](https://yumingj.github.io/projects/Text2Performer.html)

## Audio Guided Human Video Generation

### Dance Video Generation
+ [DanceIt (TIP 2021)](https://arxiv.org/pdf/2009.08027)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2009.08027)
[![Star](https://img.shields.io/github/stars/iCVTEAM/DanceIt.svg?style=social&label=Star)](https://github.com/iCVTEAM/DanceIt)

### Performance Video Generation
+ [Music2Play (CAC 2023)](https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10450842)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10450842)

### Co-Speech Gesture Video Generation
+ [S2G-MDDiffusion (CVPR 2024)](https://github.com/thuhcsi/S2G-MDDiffusion)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2404.01862)
[![Star](https://img.shields.io/github/stars/thuhcsi/S2G-MDDiffusion.svg?style=social&label=Star)](https://github.com/thuhcsi/S2G-MDDiffusion)
[![Website](https://img.shields.io/badge/Website-9cf)](https://thuhcsi.github.io/S2G-MDDiffusion/)

+ [ANGIE (NeurIPS 2022)](https://github.com/alvinliu0/ANGIE)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2212.02350)
[![Star](https://img.shields.io/github/stars/alvinliu0/ANGIE.svg?style=social&label=Star)](https://github.com/alvinliu0/ANGIE)
[![Website](https://img.shields.io/badge/Website-9cf)](https://alvinliu0.github.io/projects/ANGIE)

## Pose Guided Human Video Generation

+ [FollowYourPose](https://github.com/mayuelala/followyourpose)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2304.01186)
[![Star](https://img.shields.io/github/stars/mayuelala/followyourpose.svg?style=social&label=Star)](https://github.com/mayuelala/followyourpose)
[![Website](https://img.shields.io/badge/Website-9cf)](https://follow-your-pose.github.io/)

+ [PSGAN](https://arxiv.org/pdf/1807.11152v1)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/1807.11152v1)

+ [DwNet](https://arxiv.org/pdf/1910.09139)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/1910.09139)
[![Star](https://img.shields.io/github/stars/ubc-vision/DwNet.svg?style=social&label=Star)](https://github.com/ubc-vision/DwNet)

+ [AnimateAnyone](https://arxiv.org/pdf/2311.17117)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2311.17117)
[![Star](https://img.shields.io/github/stars/MooreThreads/Moore-AnimateAnyone.svg?style=social&label=Star)](https://github.com/MooreThreads/Moore-AnimateAnyone)
[![Website](https://img.shields.io/badge/Website-9cf)](https://humanaigc.github.io/animate-anyone/)

+ [ID-Animator](https://arxiv.org/abs/2404.15275)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2404.15275)
[![Star](https://img.shields.io/github/stars/ID-Animator/ID-Animator.svg?style=social&label=Star)](https://github.com/ID-Animator/ID-Animator)
[![Website](https://img.shields.io/badge/Website-9cf)](https://id-animator.github.io/)

+ [DreaMoving](https://dreamoving.github.io/dreamoving/)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2312.05107)
[![Star](https://img.shields.io/github/stars/dreamoving/dreamoving-project.svg?style=social&label=Star)](https://github.com/dreamoving/dreamoving-project)
[![Website](https://img.shields.io/badge/Website-9cf)](https://dreamoving.github.io/dreamoving/)

+ [MagicPose](https://boese0601.github.io/magicdance/)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2311.12052)
[![Star](https://img.shields.io/github/stars/Boese0601/MagicDance.svg?style=social&label=Star)](https://github.com/Boese0601/MagicDance)
[![Website](https://img.shields.io/badge/Website-9cf)](https://boese0601.github.io/magicdance/)

+ [LEO](https://wyhsirius.github.io/LEO-project/)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2305.03989)
[![Star](https://img.shields.io/github/stars/wyhsirius/LEO.svg?style=social&label=Star)](https://github.com/wyhsirius/LEO)
[![Website](https://img.shields.io/badge/Website-9cf)](https://wyhsirius.github.io/LEO-project/)

+ [MuseV](https://github.com/TMElyralab/MuseV)
[![Star](https://img.shields.io/github/stars/TMElyralab/MuseV.svg?style=social&label=Star)](https://github.com/TMElyralab/MuseV)
[![Website](https://img.shields.io/badge/Website-9cf)](https://tmelyralab.github.io/MuseV_Page/)

+ [MusePose](https://github.com/TMElyralab/MusePose)
[![Star](https://img.shields.io/github/stars/TMElyralab/MusePose.svg?style=social&label=Star)](https://github.com/TMElyralab/MusePose)

+ [MagicAnimate](https://showlab.github.io/magicanimate/)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2311.16498)
[![Star](https://img.shields.io/github/stars/magic-research/magic-animate.svg?style=social&label=Star)](https://github.com/magic-research/magic-animate)
[![Website](https://img.shields.io/badge/Website-9cf)](https://showlab.github.io/magicanimate/)

+ [UniAnimate](https://github.com/ali-vilab/UniAnimate)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2406.01188)
[![Star](https://img.shields.io/github/stars/ali-vilab/UniAnimate.svg?style=social&label=Star)](https://github.com/ali-vilab/UniAnimate)
[![Website](https://img.shields.io/badge/Website-9cf)](https://unianimate.github.io/)

+ [Champ](https://github.com/fudan-generative-vision/champ)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2403.14781)
[![Star](https://img.shields.io/github/stars/fudan-generative-vision/champ.svg?style=social&label=Star)](https://github.com/fudan-generative-vision/champ)
[![Website](https://img.shields.io/badge/Website-9cf)](https://fudan-generative-vision.github.io/champ/#/)

+ [Magic-Me](https://github.com/zhen-dong/magic-me)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2402.09368)
[![Star](https://img.shields.io/github/stars/Zhen-Dong/Magic-Me.svg?style=social&label=Star)](https://github.com/Zhen-Dong/Magic-Me)
[![Website](https://img.shields.io/badge/Website-9cf)](https://magic-me-webpage.github.io/)

+ [Disco](https://disco-dance.github.io/)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2307.00040)
[![Star](https://img.shields.io/github/stars/Wangt-CN/DisCo.svg?style=social&label=Star)](https://github.com/Wangt-CN/DisCo)
[![Website](https://img.shields.io/badge/Website-9cf)](https://disco-dance.github.io/)

+ [TwoStreamVAN](https://arxiv.org/pdf/1812.01037)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/1812.01037)
[![Star](https://img.shields.io/github/stars/sunxm2357/TwoStreamVAN.svg?style=social&label=Star)](https://github.com/sunxm2357/TwoStreamVAN)

+ [MoCoGAN](https://github.com/sergeytulyakov/mocogan)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/1707.04993)
[![Star](https://img.shields.io/github/stars/sergeytulyakov/mocogan.svg?style=social&label=Star)](https://github.com/sergeytulyakov/mocogan)

+ [Pose-Guided Human Animation from a Single Image in the Wild](https://arxiv.org/pdf/2012.03796)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2012.03796)

+ [MimicMotion](https://arxiv.org/abs/2406.19680)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2406.19680)

+ [Human4DiT](https://arxiv.org/pdf/2405.17405)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2405.17405)

+ [FollowYourPose-v2](https://arxiv.org/pdf/2406.03035)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/pdf/2406.03035)

+ [IDOL](https://yhzhai.github.io/idol/)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://arxiv.org/abs/2407.10937)

+ [MagicFight](https://openreview.net/pdf?id=7JhV3Pbfgk)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://openreview.net/pdf?id=7JhV3Pbfgk)

+ [DiffPerformer (CVPR 2024)](https://openaccess.thecvf.com/content/CVPR2024/papers/Wang_DiffPerformer_Iterative_Learning_of_Consistent_Latent_Guidance_for_Diffusion-based_Human_CVPR_2024_paper.pdf)
[![arXiv](https://img.shields.io/badge/arXiv-b31b1b.svg)](https://openaccess.thecvf.com/content/CVPR2024/papers/Wang_DiffPerformer_Iterative_Learning_of_Consistent_Latent_Guidance_for_Diffusion-based_Human_CVPR_2024_paper.pdf)

## Applications
+ [Kling](https://kling.kuaishou.com/)
[![Website](https://img.shields.io/badge/Website-9cf)](https://kling.kuaishou.com/)

+ [Viggle](https://viggle.ai/)
[![Star](https://img.shields.io/github/stars/hoachen/veggieai-generate-video.svg?style=social&label=Star)](https://github.com/hoachen/veggieai-generate-video)
[![Website](https://img.shields.io/badge/Website-9cf)](https://viggle.ai/)

+ [Luma](https://lumalabs.ai/dream-machine)
[![Website](https://img.shields.io/badge/Website-9cf)](https://lumalabs.ai/dream-machine)

+ [Runway](https://runwayml.com/)
[![Website](https://img.shields.io/badge/Website-9cf)](https://runwayml.com/)

## Datasets
+ [Text2Performer](https://drive.google.com/drive/folders/1NFd_irnw8kgNcu5KfWhRA8RZPdBK5p1I)
+ [TikTok](https://www.kaggle.com/datasets/yasaminjafarian/tiktokdataset?resource=download)
+ [TikTok-v4](https://drive.google.com/file/d/1jEK0YJ5AfZZuFNqGGqOtUPFx--TIebT9/view)
+ [TED Gesture Dataset](https://github.com/youngwoo-yoon/youtube-gesture-dataset)
+ [BoLD](https://cydar.ist.psu.edu/emotionchallenge/dataset.php)
+ [DeepFashion](https://mmlab.ie.cuhk.edu.hk/projects/DeepFashion.html)
+ [TaiChi](https://github.com/AliaksandrSiarohin/video-preprocessing)
+ [UBody](https://osx-ubody.github.io/)
+ [ASTS](https://www.wisdom.weizmann.ac.il/~vision/SpaceTimeActions.html)
+ [UCF-101](https://www.crcv.ucf.edu/data/UCF101.php)
+ [Human3.6M](http://vision.imar.ro/human3.6m/description.php)
+ [NTU RGB+D](https://rose1.ntu.edu.sg/dataset/actionRecognition/)
+ [HAR](https://www.kaggle.com/datasets/sharjeelmazhar/human-activity-recognition-video-dataset)
+ [3D People Synthetic](https://drive.google.com/file/d/1N9gioWnkb3ZZytmT3Nzx4VjXjHxLsVB9/view)
+ [MSP-AVATAR](https://ecs.utdallas.edu/research/researchlabs/msp-lab/MSP-AVATAR.html)
+ [EverybodyDanceNow](https://github.com/carolineec/EverybodyDanceNow)
+ [AIST++](https://google.github.io/aistplusplus_dataset/factsfigures.html)
+ [DanceIt](https://github.com/iCVTEAM/DanceIt)
+ [DisCo](https://drive.google.com/file/d/1N9gioWnkb3ZZytmT3Nzx4VjXjHxLsVB9/view)
+ [Sub-URMP](https://www.cs.rochester.edu/~cxu22/d/vagan/)
+ [URMP](https://labsites.rochester.edu/air/projects/URMP.html)
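
For convenience, here is a minimal sketch of how one of the datasets above could be fetched programmatically. It uses the Kaggle API for the TikTok dataset; it assumes the `kaggle` Python package is installed and an API token is configured in `~/.kaggle/kaggle.json`, and the `data/tiktok` output directory is an illustrative choice, not part of the dataset.

```python
# Minimal sketch: fetch the TikTok dataset listed above via the Kaggle API.
# Assumes `pip install kaggle` and a valid API token in ~/.kaggle/kaggle.json.
from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()  # reads credentials from ~/.kaggle/kaggle.json

# The dataset slug comes from the Kaggle URL in the list above.
api.dataset_download_files(
    "yasaminjafarian/tiktokdataset",
    path="data/tiktok",  # illustrative local directory (assumption)
    unzip=True,          # extract the downloaded archive
)
```

Most of the other datasets are distributed through direct links (Google Drive files, GitHub repositories, or project pages), so the links above can be used as-is.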