# Multiverse

This repository contains the code and models for the following CVPR'20 paper:

**[The Garden of Forking Paths: Towards Multi-Future Trajectory Prediction](https://arxiv.org/abs/1912.06445)** \
[Junwei Liang](https://www.cs.cmu.edu/~junweil/),
[Lu Jiang](http://www.lujiang.info/),
[Kevin Murphy](https://www.cs.ubc.ca/~murphyk/),
[Ting Yu](https://scholar.google.com/citations?user=_lswGcYAAAAJ&hl=en),
[Alexander Hauptmann](https://www.cs.cmu.edu/~alex/)

You can find more information on our [project page](https://precognition.team/next/multiverse/) and in the [blog post](https://medium.com/@junweil/cvpr20-the-garden-of-forking-paths-towards-multi-future-trajectory-prediction-df23221dc9f8).

The **SimAug** (ECCV'20) project is [here](SimAug/README.md).

+ *[11/2022] The CMU server is down. You can replace all occurrences of `https://next.cs.cmu.edu` with `https://precognition.team/next/` to download the necessary resources. (I did not back up the pretrained models; please open an issue and share the model tgzs with me if you downloaded them from the CMU server before.)*

If you find this code useful in your research, please cite:

```
@inproceedings{liang2020garden,
  title={The Garden of Forking Paths: Towards Multi-Future Trajectory Prediction},
  author={Liang, Junwei and Jiang, Lu and Murphy, Kevin and Yu, Ting and Hauptmann, Alexander},
  booktitle={The IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month={June},
  year={2020}
}
@inproceedings{liang2020simaug,
  title={SimAug: Learning Robust Representations from Simulation for Trajectory Prediction},
  author={Liang, Junwei and Jiang, Lu and Hauptmann, Alexander},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  month={August},
  year={2020}
}
```

# Introduction

<div align="center">
  <div style="">
      <img src="images/prob.gif" height="300px" />
  </div>
  <p style="font-weight:bold;font-size:1.2em;">
    This paper proposes the first multi-future pedestrian trajectory prediction dataset and a multi-future prediction method called Multiverse.
  </p>
</div>

This paper studies the problem of predicting the distribution over multiple possible future paths of people as they move through various visual scenes. We make two main contributions. The first contribution is a new dataset called the **Forking Paths Dataset**, created in a realistic 3D simulator: it starts from real-world trajectory data, which human annotators then extrapolate toward different latent goals. This provides the **first benchmark** for quantitative evaluation of models that predict **multi-future trajectories**.

# The Forking Paths Dataset

+ Current dataset version: v1

+ Download links: [Google Drive](https://drive.google.com/file/d/1yESCQuIdiDNanUSX0qDyzbBRe_AeZB5a/view?usp=sharing) ([sha256sum](https://precognition.team/next/multiverse/dataset/ForkingPaths_dataset_v1.sha256sum.txt)) / [Baidu Pan](https://pan.baidu.com/s/1nuc726hX8bUBXmMRj6UBJw) (access code: tpd7)

+ The dataset includes 3000 videos at 1920x1080 (750 human-annotated trajectory samples, each rendered from 4 camera views) with bounding-box and scene semantic segmentation ground truth. More notes and instructions about the dataset can be found [here](forking_paths_dataset/README.md#annotations).

+ Instructions for adding more human annotations, editing the scenes, recreating scenarios from real-world videos, or simply playing with the 3D simulator can be found [here](forking_paths_dataset/README.md#record-more-annotations).

+ Instructions for semi-automatically **re-creating real-world video scenarios** in the 3D simulation using homography matrices can be found [here](forking_paths_dataset/README.md#recreate-scenarios-from-real-world-videos).

<div align="center">
  <div style="">
      <img src="images/multi_view_v2.gif" height="210px" />
      <img src="images/multi_view2_v2.gif" height="210px" />
  </div>
  <p style="font-weight:bold;font-size:1.2em;">
    <a href="http://www.youtube.com/watch?feature=player_embedded&v=RW45YQHxIhk" target="_blank">Demo Video</a>
  </p>
</div>


# The Multiverse Model

<div align="center">
  <div style="">
      <img src="images/cvpr2020_model.png" height="300px" />
  </div>
  <br/>
</div>

Our second contribution is a new model that generates multiple plausible future trajectories, featuring novel designs of multi-scale location encodings and convolutional RNNs over graphs. We refer to our model as Multiverse.


## Dependencies
+ Python 2/3; TensorFlow-GPU >= 1.15.0

## Pretrained Models
You can download the pretrained models by running
`bash scripts/download_single_models.sh`.

## Testing and Visualization
Instructions for testing the pretrained models can be [found here](TESTING.md).

<div align="center">
  <div style="">
      <img src="images/0400_40_256_cam2_sgan.gif" height="255px" />
      <img src="images/0400_40_256_cam2_ours.gif" height="255px" />
  </div>
  <p style="font-weight:bold;font-size:1.2em;">
    Qualitative comparison between Social-GAN (left) and our model (right).
  </p>
</div>

## Training new models
Instructions for training new models can be [found here](TRAINING.md).

## Acknowledgments
The Forking Paths Dataset is created based on the [CARLA Simulator](https://carla.org) and [Unreal Engine 4](https://www.unrealengine.com/en-US/).
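As a footnote on the dataset download: the release publishes a `sha256sum` file, and the archive should be verified after downloading. A minimal sketch of that check in Python (the `sha256_of`/`verify` helpers and the `demo.tgz` stand-in file are mine, not part of this repo; for the real dataset, compare against the digest in `ForkingPaths_dataset_v1.sha256sum.txt`):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_hex):
    """True iff the file's digest matches the published checksum."""
    return sha256_of(path) == expected_hex

# Demo with a stand-in file instead of the multi-GB dataset archive.
with open("demo.tgz", "wb") as f:
    f.write(b"demo archive contents\n")
expected = hashlib.sha256(b"demo archive contents\n").hexdigest()
print(verify("demo.tgz", expected))  # True
```

Equivalently, `sha256sum -c ForkingPaths_dataset_v1.sha256sum.txt` does the same check from the command line on systems with GNU coreutils.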
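Since the code targets TensorFlow-GPU >= 1.15.0 under Python 2 or 3, one quick sanity check before training is a version comparison against that requirement. A minimal sketch (the `version_tuple`/`meets_requirement` helpers are my own illustration, not part of this repo):

```python
import re

def version_tuple(v):
    """'1.15.0' -> (1, 15, 0); ignores build suffixes like 'rc0' or '+cpu'."""
    return tuple(int(n) for n in re.findall(r"\d+", v.split("+")[0])[:3])

def meets_requirement(installed, required="1.15.0"):
    """True iff the installed version is at least the required one."""
    return version_tuple(installed) >= version_tuple(required)

# In a real environment you would check the installed TensorFlow:
#   import tensorflow as tf
#   assert meets_requirement(tf.__version__), "need TensorFlow >= 1.15.0"
print(meets_requirement("1.15.2"))  # True
print(meets_requirement("1.14.0"))  # False
```

Tuple comparison handles the multi-digit minor version correctly (`(1, 15, 0)` sorts after `(1, 9, 0)`), which a plain string comparison would get wrong.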