
## 🤔🎞️ Video-of-Thought: Step-by-Step Video Reasoning from Perception to Cognition

PyTorch 1.8.1 | Build Status

**The implementation of the ICML 2024 paper [Video-of-Thought: Step-by-Step Video Reasoning from Perception to Cognition](https://is.gd/fcfZeO)**

----------
### 🎉 Visit the project page: [VoT](http://haofei.vip/VoT/)

----------

## Overview

> VoT is the first video Chain-of-Thought reasoning framework. It decomposes a raw, complex problem into a chain of sub-problems and reasons through multiple steps from low to high levels, enabling not only pixel-level perceptive recognition but also semantic, cognitive understanding of videos.
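
Since the official code is still TBD (see the Code section below), the following is only a rough, hypothetical sketch of what such a step-by-step chain could look like: a question is decomposed into ordered sub-problems and a video MLLM is prompted for each one, accumulating the intermediate results. All names, prompts, and interfaces here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a video chain-of-thought loop (NOT the official VoT code).
# It only illustrates decomposing a question into ordered sub-problems and
# prompting a video MLLM step by step, from perception to cognition.

from typing import Callable, List

# Assumed interface: a video MLLM callable that takes a video path, a prompt,
# and the accumulated reasoning context, and returns a text answer.
VideoMLLM = Callable[[str, str, str], str]

# Ordered sub-problems, from low-level perception to high-level cognition.
SUB_PROBLEM_PROMPTS: List[str] = [
    "Identify the targets in the video that the question refers to.",
    "Track the targets over time and describe their spatial-temporal relations.",
    "Analyze the actions and interactions of the tracked targets.",
    "Using the analysis above, answer the original question.",
    "Verify the answer against the perceptual evidence and revise it if needed.",
]

def video_of_thought(model: VideoMLLM, video_path: str, question: str) -> str:
    """Run the question through a chain of sub-problems and return the final answer."""
    context = f"Question: {question}"
    answer = ""
    for step, prompt in enumerate(SUB_PROBLEM_PROMPTS, start=1):
        answer = model(video_path, prompt, context)
        context += f"\nStep {step}: {prompt}\nResult: {answer}"
    return answer
```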



> We also introduce a novel video MLLM, MotionEpic, which supports not only video input but also the encoding, understanding, and generation of spatial-temporal scene graphs (STSGs).
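
As a rough, unofficial illustration of what an STSG could contain, the sketch below models per-frame object nodes with spatial relations and derives temporal links across frames. All class and field names are assumptions for illustration, not MotionEpic's actual data format.

```python
# Hypothetical sketch of a spatial-temporal scene graph (STSG) container.
# Class and field names are illustrative assumptions, not MotionEpic's format.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ObjectNode:
    track_id: int                             # identity of the object across frames
    label: str                                # e.g. "person", "ball"
    bbox: Tuple[float, float, float, float]   # (x1, y1, x2, y2) in this frame

@dataclass
class FrameGraph:
    frame_index: int
    nodes: List[ObjectNode] = field(default_factory=list)
    # spatial relations within one frame: (subject track_id, predicate, object track_id)
    spatial_edges: List[Tuple[int, str, int]] = field(default_factory=list)

@dataclass
class STSG:
    frames: List[FrameGraph] = field(default_factory=list)

    def temporal_edges(self) -> List[Tuple[int, int, int]]:
        """Return (track_id, frame_i, frame_j) links for objects seen in consecutive frames."""
        links = []
        for prev, curr in zip(self.frames, self.frames[1:]):
            prev_ids = {n.track_id for n in prev.nodes}
            for node in curr.nodes:
                if node.track_id in prev_ids:
                    links.append((node.track_id, prev.frame_index, curr.frame_index))
        return links
```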



----------

## Code

(TBD)

----------

## Citation

If you use this work, please kindly cite:

```
@inproceedings{VoT24Hao,
  author    = {Hao Fei and Shengqiong Wu and Wei Ji and Hanwang Zhang and Meishan Zhang and Mong-Li Lee and Wynne Hsu},
  title     = {Video-of-Thought: Step-by-Step Video Reasoning from Perception to Cognition},
  booktitle = {Proceedings of the International Conference on Machine Learning (ICML)},
  year      = {2024},
}
```

----------
### License

The code is released under the Apache License 2.0, for noncommercial use only.

----------

### Contact

For any questions, feel free to contact [Hao Fei](mailto:[email protected]).