# M2Beats 2.0: When Motion Meets Beats in Short-form Videos Twice

This is the official code implementation for "M2Beats 2.0: When Motion Meets Beats in Short-form Videos Twice".

Project

## Installation

### 1. Create Conda Environment
```bash
conda create -n m2beats2 python=3.9 -y
conda activate m2beats2
```

### 2. Install Required Packages
```bash
pip install numpy==1.24.0 torch==1.12.1
```
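
As a quick check that the pinned versions installed correctly (optional, not part of the original setup):
```bash
# Optional: verify that the pinned torch and numpy versions import correctly
python -c "import torch, numpy; print(torch.__version__, numpy.__version__)"
```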

## Model
Download our pretrained model from [Google Drive](https://drive.google.com/drive/folders/1DsTspcqAeyAo-SJl2rfyV4k_PWm3Haer?usp=sharing).
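
The evaluation and test commands below expect the weights at `checkpoints/params.pt`. A minimal placement sketch (the download path is just an example):
```bash
# Create the directory the commands below expect and move the downloaded weights into it
mkdir -p checkpoints
mv /path/to/downloaded/params.pt checkpoints/params.pt  # adjust the source path to your download location
```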

## Dataset
Download our dataset from [Google Drive](https://drive.google.com/file/d/1DkUsyZ58WW3TMauFbW7tggET46uFjP_g/view?usp=sharing).

The dataset follows the directory structure below:
```
AIST_M2B_2/
├── data_split/
├── test_data_split.csv
└── train_data_split.csv
```
The original video and motion data can be downloaded from [AIST++](https://google.github.io/aistplusplus_dataset/factsfigures.html).
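
After extracting the archive, the layout can be sanity-checked from the shell (purely illustrative):
```bash
# Confirm the expected files are present and see how many entries each split lists
ls AIST_M2B_2
wc -l AIST_M2B_2/train_data_split.csv AIST_M2B_2/test_data_split.csv
```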

## Evaluation
Run the following command to evaluate the model:
```bash
python eval.py --checkpoint checkpoints/params.pt --test AIST_M2B_2/train_data_split.csv --test_data AIST_M2B_2/data_split
```
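
Note that the command above points `--test` at `train_data_split.csv`. Presumably the held-out split can be evaluated by swapping in the test CSV, although this is an assumption based on the dataset layout rather than a documented option:
```bash
# Hypothetical variant: evaluate on the held-out split instead of the training split
python eval.py --checkpoint checkpoints/params.pt --test AIST_M2B_2/test_data_split.csv --test_data AIST_M2B_2/data_split
```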

## Test
### Testing on Sample Data from the Dataset
```bash
python test.py --checkpoint checkpoints/params.pt --test AIST_M2B_2/train_data_split.csv --test_data AIST_M2B_2/data_split
```

### Testing on Your Own Data
If you want to test on your own data, you first need to extract 2D human keypoints from your videos. We recommend using [mmpose](https://github.com/open-mmlab/mmpose).
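
One possible starting point is mmpose's inferencer demo. The script path, model alias, and flags below follow the mmpose documentation, but they may differ between versions, so treat this as a sketch rather than a verified command:
```bash
# Sketch only: run mmpose's 2D human pose inferencer on a video and dump keypoint predictions
# (run from an mmpose checkout with mmpose installed; check the mmpose docs for your version's exact flags)
python demo/inferencer_demo.py my_video.mp4 --pose2d human --pred-out-dir keypoints/
```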

---

More details will be available once our paper is published!