Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.

awesome-colab-notebooks

A collection of Google Colaboratory notebooks for fast and easy experiments
https://github.com/amrzv/awesome-colab-notebooks

  • Repos:
    • Artiprocher/DiffSynth-Studio [![](https://img.shields.io/github/stars/Artiprocher/DiffSynth-Studio?style=social)](https://github.com/Artiprocher/DiffSynth-Studio)
    • wilson1yan/VideoGPT [![](https://img.shields.io/github/stars/wilson1yan/VideoGPT?style=social)](https://github.com/wilson1yan/VideoGPT)
    • horseee/DeepCache [![](https://img.shields.io/github/stars/horseee/DeepCache?style=social)](https://github.com/horseee/DeepCache)
    • metavoiceio/metavoice-src [![](https://img.shields.io/github/stars/metavoiceio/metavoice-src?style=social)](https://github.com/metavoiceio/metavoice-src)
    • QwenLM/Qwen-VL [![](https://img.shields.io/github/stars/QwenLM/Qwen-VL?style=social)](https://github.com/QwenLM/Qwen-VL)
    • EleutherAI/lm-evaluation-harness [![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)
    • KillianLucas/open-interpreter [![](https://img.shields.io/github/stars/KillianLucas/open-interpreter?style=social)](https://github.com/KillianLucas/open-interpreter)
    • comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)
    • piddnad/DDColor [![](https://img.shields.io/github/stars/piddnad/DDColor?style=social)](https://github.com/piddnad/DDColor)
    • google-research/big_vision [![](https://img.shields.io/github/stars/google-research/big_vision?style=social)](https://github.com/google-research/big_vision)
    • microsoft/xr-development-for-beginners [![](https://img.shields.io/github/stars/microsoft/xr-development-for-beginners?style=social)](https://github.com/microsoft/xr-development-for-beginners)
    • OpenTalker/video-retalking [![](https://img.shields.io/github/stars/OpenTalker/video-retalking?style=social)](https://github.com/OpenTalker/video-retalking)
    • ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)
    • nd-ball/py-irt [![](https://img.shields.io/github/stars/nd-ball/py-irt?style=social)](https://github.com/nd-ball/py-irt)
    • drengskapur/colab2pdf [![](https://img.shields.io/github/stars/drengskapur/colab2pdf?style=social)](https://github.com/drengskapur/colab2pdf)
    • graphdeco-inria/gaussian-splatting [![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting)
    • haotian-liu/LLaVA [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA)
    • microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)
    • ShishirPatil/gorilla [![](https://img.shields.io/github/stars/ShishirPatil/gorilla?style=social)](https://github.com/ShishirPatil/gorilla)
    • wenquanlu/HandRefiner [![](https://img.shields.io/github/stars/wenquanlu/HandRefiner?style=social)](https://github.com/wenquanlu/HandRefiner)
    • facebookresearch/home-robot [![](https://img.shields.io/github/stars/facebookresearch/home-robot?style=social)](https://github.com/facebookresearch/home-robot)
    • guoyww/animatediff [![](https://img.shields.io/github/stars/guoyww/animatediff?style=social)](https://github.com/guoyww/animatediff)
    • IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)
    • mistralai/mistral-src [![](https://img.shields.io/github/stars/mistralai/mistral-src?style=social)](https://github.com/mistralai/mistral-src)
    • TencentARC/PhotoMaker [![](https://img.shields.io/github/stars/TencentARC/PhotoMaker?style=social)](https://github.com/TencentARC/PhotoMaker)
    • Stability-AI/StableCascade [![](https://img.shields.io/github/stars/Stability-AI/StableCascade?style=social)](https://github.com/Stability-AI/StableCascade)
    • Algolzw/daclip-uir [![](https://img.shields.io/github/stars/Algolzw/daclip-uir?style=social)](https://github.com/Algolzw/daclip-uir)
    • rohitgandikota/sliders [![](https://img.shields.io/github/stars/rohitgandikota/sliders?style=social)](https://github.com/rohitgandikota/sliders)
    • sczhou/CodeFormer [![](https://img.shields.io/github/stars/sczhou/CodeFormer?style=social)](https://github.com/sczhou/CodeFormer)
    • damian0815/compel [![](https://img.shields.io/github/stars/damian0815/compel?style=social)](https://github.com/damian0815/compel)
  • Papers:
    • LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)
    • SkyAR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2022.3192717)](https://doi.org/10.1109/TIP.2022.3192717)
    • DFL-Colab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628)
    • GHOST [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ACCESS.2022.3196668)](https://doi.org/10.1109/ACCESS.2022.3196668)
    • SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)
    • py-irt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2021.acl-long.346)](https://doi.org/10.18653/v1/2021.acl-long.346)
    • Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)
    • PyTerrier [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3459637.3482013)](https://doi.org/10.1145/3459637.3482013)
    • ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)
    • Motion Supervised co-part Segmentation [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICPR48806.2021.9412520)](https://doi.org/10.1109/ICPR48806.2021.9412520)
    • VToonify [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550454.3555437)](https://doi.org/10.1145/3550454.3555437)
    • DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)
    • OWL-ViT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42)
    • FILM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_15)](https://doi.org/10.1007/978-3-031-20071-7_15)
    • Rethinking Style Transfer: From Pixels to Parameterized Brushstrokes [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01202)](https://doi.org/10.1109/CVPR46437.2021.01202)
    • LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)
    • LaSAFT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICASSP39728.2021.9413896)](https://doi.org/10.1109/ICASSP39728.2021.9413896)
    • ConvNeXt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)
    • Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)
    • SimSwap [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3394171.3413630)](https://doi.org/10.1145/3394171.3413630)
    • NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)
    • MSG-Net [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-11018-5_32)](https://doi.org/10.1007/978-3-030-11018-5_32)
    • Geometry-Free View Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.01409)](https://doi.org/10.1109/ICCV48922.2021.01409)
    • Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)
    • AnimeGANv2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-981-15-5577-0_18)](https://doi.org/10.1007/978-981-15-5577-0_18)
    • deep-significance [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266)
    • FGVC [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58610-2_42)](https://doi.org/10.1007/978-3-030-58610-2_42)
    • AlphaTensor [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4)
    • AlphaFold [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03819-2)](https://doi.org/10.1038/s41586-021-03819-2)
    • Taming Transformers for High-Resolution Image Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268)
    • NeRViS [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00230)](https://doi.org/10.1109/ICCV48922.2021.00230)
    • EfficientDet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.01079)](https://doi.org/10.1109/CVPR42600.2020.01079)
  • Pablo Pernias et al. <ul><li>[Marc Aubreville](https://lme.tf.fau.de/person/aubreville/)</li></ul> | [![](https://img.shields.io/github/stars/dome272/wuerstchen?style=social)](https://github.com/dome272/wuerstchen) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.00637)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/blog/wuerstchen)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/16hsklt/w%C3%BCrstchen_is_here_a_game_changing_fastest/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ogJsCPqgFMk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/dome272/Wuerstchen/blob/main/w%C3%BCrstchen-stage-C.ipynb) | 30.03.2024 |
  • Vage Egiazarian et al. | <ul><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T-Sample), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/Vahe1994/AQLM)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/LearningMachines/comments/1atvrnl/240106118_extreme_compression_of_large_language/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Qx8PNk4OkUA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hAHBKAXO-88)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Vahe1994/AQLM/blob/main/notebooks/colab_example.ipynb) | 08.03.2024 |
  • Ming Zhong et al. <ul><li>[Yadong Lu](https://adamlu123.github.io/)</li><details><summary>others</summary><li>[Yizhu Jiao](https://yzjiao.github.io/)</li> <li>[Siru Ouyang](https://ozyyshr.github.io/)</li> <li>[Donghan Yu](https://plusross.github.io/)</li> <li>[Jiawei Han](https://hanj.cs.illinois.edu/)</li> <li>[Weizhu Chen](https://www.microsoft.com/en-us/research/people/wzchen/)</li></ul></details> | [![](https://img.shields.io/github/stars/maszhongming/Multi-LoRA-Composition?style=social)](https://github.com/maszhongming/Multi-LoRA-Composition) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2402.16843)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@letscodeai/multi-lora-composition-for-image-generation-f2706528c590)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/ninjasaid13/comments/1b13q8s/multilora_composition_for_image_generation/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://x.com/MingZhong_/status/1762347881812443575?s=20)</li><li>[website](https://maszhongming.github.io/Multi-LoRA-Composition/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1eSTj6qGOtSY5NaazwwN3meXOzEZxgaZq) | 03.03.2024 |
  • John Jumper et al. <ul><li>[Kathryn Tunyasuvunakool](https://scholar.google.com/citations?user=eEqNGagAAAAJ)</li> <li>[Russ Bates](https://scholar.google.com/citations?user=Koes5ewAAAAJ)</li> <li>[Augustin Žídek](https://augustin.zidek.eu/)</li> <li>[Anna Potapenko](http://apotapenko.com/)</li> <li>[Alex Bridgland](https://scholar.google.com/citations?user=VWmXKPMAAAAJ)</li> <li>[Clemens Meyer](https://scholar.google.com/citations?user=EWLZiM8AAAAJ)</li> <li>[Simon Kohl](https://www.simonkohl.com/)</li> <li>[Andrew Ballard](https://scholar.google.com/citations?user=syjQhAMAAAAJ)</li> <li>[Bernardino Romera-Paredes](https://sites.google.com/site/romeraparedes/)</li> <li>[Stanislav Nikolov](https://scholar.google.co.uk/citations?user=O-b7pBEAAAAJ)</li> <li>[Rishub Jain](http://rishub.me/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03819-2)](https://doi.org/10.1038/s41586-021-03819-2) [![](https://img.shields.io/github/stars/deepmind/alphafold?style=social)](https://github.com/deepmind/alphafold/) <ul><li>[blog post](https://deepmind.com/blog/article/alphafold-a-solution-to-a-50-year-old-grand-challenge-in-biology), [blog post](https://deepmind.com/blog/article/putting-the-power-of-alphafold-into-the-worlds-hands)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/tree), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/chex)</li><li>[paper](https://www.nature.com/articles/s41586-021-03828-1)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/alphafold)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/AlphaFold)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=gg7WjuFs8F4), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=B9PL__gVxLI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/alphafold/blob/master/notebooks/AlphaFold.ipynb) | 29.02.2024 |
  • Nathalie Pochet et al. <ul><li>[Mohsen Nabian](https://github.com/monabiyan)</li> <li>[Jayendra Shinde](https://jayendrashinde91.github.io/)</li><details><summary>others</summary><li>[Celine Everaert](http://www.crig.ugent.be/en/node/510)</li> <li>[Thorin Tabor](http://thorin.tabcreations.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/gevaertlab/AMARETTO?style=social)](https://github.com/gevaertlab/AMARETTO) <ul><li>[bioconductor](https://bioconductor.org/packages/release/bioc/html/AMARETTO.html)</li><li>[project](http://portals.broadinstitute.org/pochetlab/amaretto.html)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JfnRoNgTVX_7VEGAAmjGjwP_yX2tdDxs) | 28.02.2024 |
  • Carl Doersch et al. <ul><li>[Andrew Zisserman](https://www.robots.ox.ac.uk/~az/)</li></ul> | [![](https://img.shields.io/github/stars/google-deepmind/tapnet?style=social)](https://github.com/google-deepmind/tapnet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.08637), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.15975)</li><li>[blog post](https://deepmind-tapir.github.io/), [blog post](https://deepmind-tapir.github.io/blogpost.html)</li><li>[<img src="images/deepmind.svg" alt="deepmind" height=20/>](https://www.deepmind.com/open-source/kinetics)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/kubric/tree/main/challenges/point_tracking)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@jumabek4044/what-is-tapir-tracking-any-point-with-per-frame-initialization-and-temporal-refinement-and-how-it-bdad9946dc53)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper_files/paper/2022/hash/58168e8a92994655d6da3939e7cc0918-Abstract-Datasets_and_Benchmarks.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2HSHofqoJ9M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/I1DQJH3v7Nk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/tapnet/blob/master/colabs/causal_tapir_demo.ipynb) | 08.02.2024 |
  • Victor Dibia | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11) [![](https://img.shields.io/github/stars/microsoft/lida?style=social)](https://github.com/microsoft/lida) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.02927)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/victordibia/llmx), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lida-project/lida-streamlit)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@c17hawke/lida-automatically-generate-visualization-and-with-llms-the-future-of-data-visualization-6bc556876b46)</li><li>[project](https://microsoft.github.io/lida/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/exYi9W-dhME), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U9K1Cu45nMQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/6xcCwlDx6f8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/lida/blob/main/notebooks/tutorial.ipynb) | 06.02.2024 |
  • Alexey Dosovitskiy et al. | [![](https://img.shields.io/github/stars/google-research/vision_transformer?style=social)](https://github.com/google-research/vision_transformer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2010.11929), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.01601), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.10270), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.01548), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.07991), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.08065)</li><li>[blog post](https://blog.research.google/2022/04/locked-image-tuning-adding-language.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/huggingface/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/flaxformer)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/models)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@weiwen21/an-image-is-worth-16x16-words-transformers-for-image-recognition-at-scale-957f88e53726)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TrdevFK_am4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HZ4j_U3FC94), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7K4Z8RqjWIk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oDtcobGQ7xU?si=C2EgZTESzhTXFSq6), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/v6xj_DG-UEo)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/vision_transformer/blob/main/vit_jax.ipynb) | 06.02.2024 |
  • Alexander Mathis et al. | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41593-018-0209-y)](https://doi.org/10.1038/s41593-018-0209-y) [![](https://img.shields.io/github/stars/DeepLabCut/DeepLabCut?style=social)](https://github.com/DeepLabCut/DeepLabCut) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1605.03170), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1804.03142), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.11229), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2009.00564), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.13868)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/deeplabcut/deeplabcut)</li><li>[forum](https://forum.image.sc/tag/deeplabcut)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/DeepLabCut/DLCutils), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DeepLabCut/DeepLabCut-Workshop-Materials)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@cziscience/how-open-source-software-contributors-are-accelerating-biomedicine-1a5f50f6846a)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/DeepLabCut)</li><li>[website](https://www.deeplabcut.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/@deeplabcut7702), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uWZu3rnj-kQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Teb5r2TNAYs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/DeepLabCut/DeepLabCut/blob/master/examples/COLAB/COLAB_maDLC_TrainNetwork_VideoAnalysis.ipynb) | 02.02.2024 |
  • Manuel Romero | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3355089.3356528)](https://doi.org/10.1145/3355089.3356528) [![](https://img.shields.io/github/stars/sniklaus/3d-ken-burns?style=social)](https://github.com/sniklaus/3d-ken-burns) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.05483)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=WrajxHHfRBA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mrm8488/shared_colab_notebooks/blob/master/3D_Ken_Burns.ipynb) | 24.01.2024 |
  • Ziqiang Zhang et al. <ul><li>[Chengyi Wang](https://cywang97.github.io/)</li> <li>[Sanyuan Chen](https://sanyuan-chen.github.io/)</li><details><summary>others</summary><li>[Yu Wu](https://www.microsoft.com/en-us/research/people/yuwu1/)</li> <li>[Shujie Liu](https://www.microsoft.com/en-us/research/people/shujliu/)</li> <li>[Zhuo Chen](https://www.microsoft.com/en-us/research/people/zhuc/)</li> <li>[Yanqing Liu](https://scholar.google.com/citations?user=dIJFz4UAAAAJ)</li> <li>[Huaming Wang](https://scholar.google.com/citations?user=aJDLg5IAAAAJ)</li> <li>[Jinyu Li](https://www.microsoft.com/en-us/research/people/jinyli/)</li> <li>[Lei He](https://scholar.google.com/citations?user=EKl9yY8AAAAJ)</li> <li>[Sheng Zhao](https://scholar.google.com/citations?user=689bIIwAAAAJ)</li> <li>[Furu Wei](https://www.microsoft.com/en-us/research/people/fuwei/)</li></ul></details> | [![](https://img.shields.io/github/stars/Plachtaa/VALL-E-X?style=social)](https://github.com/Plachtaa/VALL-E-X) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.03926), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.02111), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.03143)</li><li>[demo](https://plachtaa.github.io/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/qCBRmAnTxg)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lifeiteng/vall-e)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Plachta/VALL-E-X)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/speak-a-foreign-language-in-your-own-voice-1dafa42f78d9)</li><li>[project](https://www.microsoft.com/en-us/research/project/vall-e-x)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7qgfoVFQmvk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1yyD_sz531QntLKowMHo-XxorsFBCfKul) | 19.01.2024 |
  • Zhen Li et al. <ul><li>[Ming-Ming Cheng](https://mmcheng.net/cmm/)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/TencentARC/PhotoMaker?style=social)](https://github.com/TencentARC/PhotoMaker) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2312.04461)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/bmaltais/PhotoMaker), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/sdbds/PhotoMaker-for-windows), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ZHO-ZHO-ZHO/ComfyUI-PhotoMaker), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mit-han-lab/fastcomposer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TencentARC/T2I-Adapter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/tencent-ailab/IP-Adapter)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/TencentARC/PhotoMaker)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@christopheverdier/photomaker-the-art-of-ai-consistent-characters-generation-cf2cd037bc3e)</li><li>[project](https://photo-maker.github.io/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/197bfj9/tencentarc_releases_photomaker/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NWIdzTEk5O4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZTck128jfFY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/TencentARC/PhotoMaker/blob/main/photomaker_demo.ipynb) | 18.01.2024 |
  • Xiaoyang Kang et al. <ul><li>[Peiran Ren](https://scholar.google.com/citations?user=x5dEuxsAAAAJ)</li><details><summary>others</summary><li>[Lingzhi Li](https://lingzhili.com/)</li> <li>[Xuansong Xie](https://github.com/xungie)</li></ul></details> | [![](https://img.shields.io/github/stars/piddnad/DDColor?style=social)](https://github.com/piddnad/DDColor) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.11613)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jixiaozhong/ColorFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/KIMGEONUNG/BigColor)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/DDColor-colab/blob/main/DDColor_colab.ipynb) | 15.01.2024 |
  • Tao Yang et al. | <ul><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/runwayml/stable-diffusion-v1-5), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/nitrosocke/mo-di-diffusion)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/18qxe5q/pixelaware_stable_diffusion_for_realistic_image/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1lZ_-rSGcmreLCiRniVT973x6JLjFiC-b) | 12.01.2024 |
  • Wenquan Lu et al. <ul><li>[Dacheng Tao](https://scholar.google.com/citations?user=RwlJNLcAAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/wenquanlu/HandRefiner?style=social)](https://github.com/wenquanlu/HandRefiner) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2311.17957)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Fannovel16/comfyui_controlnet_aux), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Mikubill/sd-webui-controlnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/MeshGraphormer)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1881z4v/handrefiner_refining_malformed_hands_in_generated/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Tt-Fyn1RA6c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/HandRefiner-colab/blob/main/HandRefiner_colab.ipynb) | 08.01.2024 |
  • Evonne Ng et al. | 08.01.2024 |
  • Remi Lam et al. <ul><li>[Alvaro Sanchez-Gonzalez](https://github.com/alvarosg)</li> <li>[Matthew Willson](https://github.com/mjwillson)</li> <li>[Peter Wirnsberger](https://pewi.org/)</li><details><summary>others</summary><li>[Meire Fortunato](https://scholar.google.com/citations?user=_fMHSIUAAAAJ)</li> <li>[Ferran Alet](https://scholar.google.com/citations?user=1lmBq3QAAAAJ)</li> <li>[Suman Ravuri](https://www.linkedin.com/in/suman-ravuri-81928082)</li> <li>[Timo Ewalds](https://github.com/tewalds)</li> <li>[Zach Eaton-Rosen](https://scholar.google.com/citations?user=mQ3zD_wAAAAJ)</li> <li>[Weihua Hu](https://weihua916.github.io/)</li> <li>[Alexander Merose](https://alex.merose.com/)</li> <li>[Stephan Hoyer](https://stephanhoyer.com/)</li> <li>[George Holland](https://www.linkedin.com/in/g-aracil-holland)</li> <li>[Oriol Vinyals](https://research.google/people/oriol-vinyals/)</li> <li>[Jacklynn Stott](https://linkedin.com/in/jacklynnstott)</li> <li>[Alexander Pritzel](https://github.com/a-pritzel)</li> <li>[Shakir Mohamed](https://www.shakirm.com/)</li> <li>[Peter Battaglia](https://scholar.google.com/citations?user=nQ7Ij30AAAAJ)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1126/science.adi2336)](https://doi.org/10.1126/science.adi2336) [![](https://img.shields.io/github/stars/google-deepmind/graphcast?style=social)](https://github.com/google-deepmind/graphcast) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.12794)</li><li>[data](https://www.ecmwf.int/en/forecasts/datasets/reanalysis-datasets/era5)</li><li>[<img src="images/deepmind.svg" alt="deepmind" height=20/>](https://deepmind.google/discover/blog/graphcast-ai-model-for-faster-and-more-accurate-global-weather-forecasting/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-deepmind/chex), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/dask/dask), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-deepmind/jaxline), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-deepmind/tree), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mikedh/trimesh)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/graphcast-how-to-get-things-done-f2fd5630c5fb)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BufUW7h9TB8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/PD1v5PCJs_o), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Eul-JN9Nwb0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BTyhgp9Hugc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/aJ_H4exg0xU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/graphcast/blob/master/graphcast_demo.ipynb) | 04.01.2024 |
  • Zeming Lin et al. <ul><details><summary>others</summary><li>[Allan dos Santos Costa](https://scholar.google.com/citations?user=Zb4RsFsAAAAJ)</li> <li>[Maryam Fazel-Zarandi](https://www.maryamfazel.com/)</li> <li>[Tom Sercu](https://tom.sercu.me/)</li> <li>[Salvatore Candido](https://scholar.google.com/citations?user=BDgbhmEAAAAJ)</li> <li>[Alexander Rives](https://scholar.google.com/citations?user=vqb78-gAAAAJ)</li> <li>[Joshua Meier](https://scholar.google.com/citations?user=2M0OltAAAAAJ)</li> <li>[Robert Verkuil](https://dblp.org/pid/296/8930.html)</li> <li>[Jason Liu](https://www.linkedin.com/in/liujiayi/)</li> <li>[Chloe Hsu](https://chloe-hsu.com/)</li> <li>[Adam Lerer](https://scholar.google.com/citations?user=Ad6O4-0AAAAJ)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1101/622803)](https://doi.org/10.1101/622803) [![](https://img.shields.io/github/stars/facebookresearch/esm?style=social)](https://github.com/facebookresearch/esm) <ul><li>[ESM Atlas](https://esmatlas.com/)</li><li>[FSDP](https://fairscale.readthedocs.io/en/stable/api/nn/fsdp.html)</li><li>[ICML](https://proceedings.mlr.press/v139/rao21a.html)</li><li>[data](https://ftp.uniprot.org/pub/databases/uniprot/previous_releases/release-2018_03/uniref/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/sokrypton/ColabFold)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/esm)</li><li>[paper](https://doi.org/10.1101/2022.07.20.500902), [paper](https://doi.org/10.1101/2021.07.09.450648), [paper](https://doi.org/10.1101/2022.04.10.487779), [paper](https://doi.org/10.1101/2022.12.21.521521)</li><li>[pubmed](https://pubmed.ncbi.nlm.nih.gov/33876751/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/N-eisTvUYrk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GHoE4VkDehY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/sokrypton/ColabFold/blob/main/ESMFold.ipynb) | 28.12.2023 |
  • Nikita Karaev - graham/)</li> <li>[Natalia Neverova](https://nneverova.github.io/)</li><details><summary>others</summary><li>[Andrea Vedaldi](https://www.robots.ox.ac.uk/~vedaldi/)</li> <li>[Christian Rupprecht](https://chrirupp.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/co-tracker?style=social)](https://github.com/facebookresearch/co-tracker) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2307.07635), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.11898)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/benjiebob/BADJA)</li><li>[project](https://co-tracker.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w5QVc7BVGPA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/co-tracker/blob/main/notebooks/demo.ipynb) | 28.12.2023 |
  • Haotian Liu - li.github.io/)</li></ul> | [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.08485), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.03744), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.00890), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.09958), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.14895)</li><li>[demo](https://llava.hliu.cc/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/llama.cpp/pull/3436), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/LLaVA-Med), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lm-sys/FastChat), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Segment-Everything-Everywhere-All-At-Once), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Luodian/Otter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/liuhaotian/LLaVA-Pretrain), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/liuhaotian/LLaVA-Pretrained-Projectors)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://xthemadgenius.medium.com/how-to-use-llava-large-language-and-vision-assistant-732c666b5ed0)</li><li>[project](https://llava-vl.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/mkI7EPD1vp8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kx1VpI6JzsY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RxBSmbdJ1I8), [<img src="images/yt.svg" alt="yt" 
height=20/>](https://youtu.be/mdYycY4lsuE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/t7I46dxfmWs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/KRAQkJC-XJU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/LLaVA-colab/blob/main/LLaVA_13b_4bit_vanilla_colab.ipynb) | 22.12.2023 |
  • Shanchuan Lin - Shlizerman](https://www.irakemelmacher.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00865)](https://doi.org/10.1109/CVPR46437.2021.00865) [![](https://img.shields.io/github/stars/PeterL1n/BackgroundMattingV2?style=social)](https://github.com/PeterL1n/BackgroundMattingV2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.07810)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/senguptaumd/Background-Matting), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/andreyryabtsev/BGMv2-webcam-plugin-linux)</li><li>[project](https://grail.cs.washington.edu/projects/background-matting-v2/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oMfPTeYDF9g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/b7ps21MVyTA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1cTxFq1YuoJ5QPqaTcnskwlHDolnjBkB9) | 22.12.2023 |
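For context on the BackgroundMattingV2 entry: matting models predict a per-pixel alpha (and foreground), and placing the subject on a new background then follows the standard compositing equation C = αF + (1 − α)B. A minimal per-pixel sketch in plain Python over grayscale values — an illustration of the equation, not code from the repository:

```python
def composite(fg, bg, alpha):
    # Standard matting equation, per pixel: C = alpha * F + (1 - alpha) * B.
    # fg, bg, alpha are equal-length sequences of floats in [0, 1].
    return [a * f + (1.0 - a) * b for f, b, a in zip(fg, bg, alpha)]
```

Where alpha is 1 the foreground shows through unchanged; where it is 0 the new background does.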
  • Bernhard Kerbl - inf.mpg.de/~tleimkue/)</li> <li>[George Drettakis](http://www-sop.inria.fr/members/George.Drettakis/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433) [![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.04079)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/camenduru/gaussian-splatting)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/axinc-ai/3d-gaussian-splatting-real-time-rendering-of-photorealistic-scenes-f7f1a47f060)</li><li>[project](https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/singularity/comments/163jeqa/3d_gaussian_splatting_for_realtime_radiance_field/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/T_kXY43VZnk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/UXtuigy_wYc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HVv_IQKlafQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w43KV79LsFw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TLK3TDDcJFU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kShNYOuDnlI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/juRMRej2d5c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/gaussian-splatting-colab/blob/main/gaussian_splatting_colab.ipynb) | 19.12.2023 |
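For context on the 3D Gaussian Splatting entry: rendering sorts the Gaussians by depth and alpha-composites them front to back, with each splat's contribution weighted by the transmittance left over from the splats in front of it. A toy one-pixel sketch of that accumulation — illustrative only, not the project's CUDA rasterizer:

```python
def composite_splats(splats):
    # Front-to-back alpha compositing over depth-sorted (color, alpha) splats:
    # each contributes color * alpha * transmittance accumulated so far.
    color, transmittance = 0.0, 1.0
    for c, a in splats:
        color += transmittance * a * c
        transmittance *= 1.0 - a  # light remaining for splats behind this one
    return color
```

Once transmittance reaches zero, splats further back contribute nothing, which is why the real rasterizer can terminate early per pixel.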
  • Xinyin Ma - colab/blob/main/DeepCache_colab.ipynb) | 18.12.2023 |
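For context on the DeepCache entry: the method speeds up diffusion sampling by exploiting the observation that a U-Net's high-level features change slowly across adjacent denoising steps, so the expensive deep blocks can be recomputed only occasionally and the cached result reused in between. A schematic sketch of that caching pattern with hypothetical `deep_block`/`shallow_block` callables — the shape of the idea, not the actual implementation:

```python
def run_with_cache(steps, deep_block, shallow_block, refresh_every=3):
    # Recompute the expensive deep path only every `refresh_every` steps and
    # reuse the cached features in between; the cheap shallow path runs at
    # every step.
    cached, outputs = None, []
    for i in range(steps):
        if i % refresh_every == 0:
            cached = deep_block(i)            # expensive, refreshed sparsely
        outputs.append(shallow_block(i, cached))  # cheap, every step
    return outputs
```

With `refresh_every=3` over 6 steps, the deep path runs twice instead of six times.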
  • Zhongcong Xu - CYYAAAAJ)</li> <li>[Hanshu Yan](https://hanshuyan.github.io/)</li><details><summary>others</summary><li>[Jiawei Liu](https://jia-wei-liu.github.io/)</li> <li>[Chenxu Zhang](https://zhangchenxu528.github.io/)</li> <li>[Jiashi Feng](https://sites.google.com/site/jshfeng/home)</li> <li>[Mike Shou](https://sites.google.com/view/showlab)</li></ul></details> | [![](https://img.shields.io/github/stars/magic-research/magic-animate?style=social)](https://github.com/magic-research/magic-animate) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2311.16498)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/zcxu-eric/MagicAnimate), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/runwayml/stable-diffusion-v1-5), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/sd-vae-ft-mse)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@AIWorldBlog/revolutionizing-image-animation-with-magicanimate-technology-78cc94151915)</li><li>[project](https://showlab.github.io/magicanimate/)</li><li>[website](https://www.magicanimate.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/td27SyA9M80), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1pATjLFvNtY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HeXknItbMM8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/MagicAnimate-colab/blob/main/MagicAnimate_colab.ipynb) | 18.12.2023 |
  • Xinqi Lin - of-diffusers)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-diffusion-2-1-base)</li><li>[project](https://0x3f3f3f3fun.github.io/projects/diffbir/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rGnrpxWjBOg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MIRiJGuGqsg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/DiffBIR-colab/blob/main/DiffBIR_colab.ipynb) | 18.12.2023 |
  • Tomoki Hayashi - badge.php?doi=10.1109/ICASSP40776.2020.9053795)](https://doi.org/10.1109/ICASSP40776.2020.9053795) [![](https://img.shields.io/github/stars/kan-bayashi/ParallelWaveGAN?style=social)](https://github.com/kan-bayashi/ParallelWaveGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.11480), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.06711), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05106)</li><li>[demo](https://kan-bayashi.github.io/ParallelWaveGAN/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/tacotron2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/espnet/espnet)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/espnet/notebook/blob/master/espnet2_tts_realtime_demo.ipynb) | 13.12.2023 |
  • Haohe Liu - yuan)</li> <li>[Xinhao Mei](https://xinhaomei.github.io/)</li><details><summary>others</summary><li>[Xubo Liu](https://liuxubo717.github.io/)</li> <li>[Danilo Mandic](https://www.imperial.ac.uk/people/d.mandic)</li> <li>[Wenwu Wang](http://personal.ee.surrey.ac.uk/Personal/W.Wang/)</li> <li>[Mark Plumbley](https://www.surrey.ac.uk/people/mark-plumbley)</li></ul></details> | [![](https://img.shields.io/github/stars/haoheliu/AudioLDM?style=social)](https://github.com/haoheliu/AudioLDM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.12503)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/LAION-AI/CLAP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CompVis/stable-diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/toshas/torch-fidelity)</li><li>[project](https://audioldm.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_0VTltNYhao)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/olaviinha/NeuralTextToAudio/blob/main/AudioLDM_pub.ipynb) | 02.12.2023 |
  • Noah Hollmann - freiburg.de/profile/hutter/)</li></ul> | [![](https://img.shields.io/github/stars/automl/TabPFN?style=social)](https://github.com/automl/TabPFN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2207.01848), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.11189), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.01342), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.03253), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10510)</li><li>[blog post](https://www.automl.org/tabpfn-a-transformer-that-solves-small-tabular-classification-problems-in-a-second/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/tunguz/status/1578730907711655937)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BGTO5N5-ack)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/194mCs6SEPEW6C0rcP7xWzcEtt1RBc8jJ) | 29.11.2023 |
  • Rohit Gandikota - sliders-lora-adaptors-for-precise-control-in-diffusion-models-b7f6b36fabee)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2020/hash/49856ed476ad01fcff881d57e161d73f-Abstract.html)</li><li>[project](https://sliders.baulab.info/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/180zon7/concept_sliders_lora_adaptors_for_precise_control/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/rohitgandikota/sliders/blob/main/demo_concept_sliders.ipynb) | 26.11.2023 |
  • Jinze Bai - VL?style=social)](https://github.com/QwenLM/Qwen-VL) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.12966), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.09685), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.14314)</li><li>[demo](https://modelscope.cn/studios/qwen/Qwen-VL-Chat-Demo/summary)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/z3GAxXZ9Ce)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/BradyFU/Awesome-Multimodal-Large-Language-Models/tree/Evaluation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/OFA-Sys/TouchStone), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PanQiWei/AutoGPTQ)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/AILab-CVC/SEED-Bench_Leaderboard), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Qwen/Qwen-VL)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ElrSJDg23Po), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/E3MS8GfGWj4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ju09YaO7BGA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/Qwen-VL-Chat-colab/blob/main/Qwen_VL_Chat_colab.ipynb) | 24.11.2023 |
  • Gang Liu - badge.php?doi=10.1587/transinf.2023EDP7061)](http://doi.org/10.1587/transinf.2023EDP7061) [![](https://img.shields.io/github/stars/TachibanaYoshino/AnimeGANv3?style=social)](https://github.com/TachibanaYoshino/AnimeGANv3) <ul><li>[project](https://tachibanayoshino.github.io/AnimeGANv3/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/EosubeJmAnE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5qLUflWb45E), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/iFjiaPlhVm4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/vJqQQMRYKh0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0KaScDxgyBw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/6WXhjXb5a-o)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1XYNWwM8Xq-U7KaTOqNap6A-Yq1f-V-FB) | 23.11.2023 |
  • Junsong Chen - alpha/PixArt-sigma?style=social)](https://github.com/PixArt-alpha/PixArt-sigma) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2403.04692), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.00426), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2401.05252)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/rde6eaE5Ta)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/PixArt-alpha/PixArt-alpha), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/PixArt-alpha/PixArt-LCM)</li><li>[project](https://pixart-alpha.github.io/PixArt-sigma-project/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/PixArtSigma/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1jZ5UZXk7tcpTfVwnX33dDuefNMcnW9ME) | 07.11.2023 |
  • Fitsum Reda - badge.php?doi=10.1007/978-3-031-20071-7_15)](https://doi.org/10.1007/978-3-031-20071-7_15) [![](https://img.shields.io/github/stars/google-research/frame-interpolation?style=social)](https://github.com/google-research/frame-interpolation) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.04901)</li><li>[data](http://data.csail.mit.edu/tofu/testset/vimeo_interp_test.zip), [data](https://vision.middlebury.edu/flow/data), [data](https://people.cs.umass.edu/~hzjiang/projects/superslomo/UCF101_results.zip)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/sniklaus/softmax-splatting/blob/master/benchmark.py)</li><li>[project](https://film-net.github.io/)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/tutorials/load_data/tfrecord), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/api_docs/python/tf/train/Example), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/saved_model)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/OAD-BieIjH4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1sK0uc-GJxmdnaxHhYqD2afRknakpdTNZ) | 02.11.2023 |
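For context on the frame-interpolation (FILM) entry: the naive baseline for generating an in-between frame is a per-pixel cross-fade of the two inputs, which ghosts on large motion; FILM instead estimates motion and warps the frames before blending. The baseline, as a minimal sketch over grayscale pixel lists:

```python
def linear_blend(f0, f1, t=0.5):
    # Per-pixel cross-fade between frames f0 and f1 at time t in [0, 1];
    # t=0 returns f0, t=1 returns f1, t=0.5 the midpoint frame.
    return [(1.0 - t) * a + t * b for a, b in zip(f0, f1)]
```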
  • David Junhao Zhang - wei-liu.github.io/)</li> <li>[Rui Zhao](https://ruizhaocv.github.io/)</li><details><summary>others</summary><li>[Lingmin Ran](https://siacorplab.nus.edu.sg/people/ran-lingmin/)</li> <li>[Yuchao Gu](https://ycgu.site/)</li> <li>[Difei Gao](https://scholar.google.com/citations?user=No9OsocAAAAJ)</li> <li>[Mike Zheng Shou](https://sites.google.com/view/showlab/home)</li></ul></details> | [![](https://img.shields.io/github/stars/showlab/Show-1?style=social)](https://github.com/showlab/Show-1) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.15818)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-base), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-interpolation), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-sr1), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-sr2), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/damo-vilab/modelscope-damo-text-to-video-synthesis), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/cerspense/zeroscope_v2_576w)</li><li>[project](https://showlab.github.io/Show-1/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/Show-1-colab/blob/main/Show_1_steps_colab.ipynb) | 15.10.2023 |
  • Xubo Liu - yuan)</li> <li>[Yuzhuo Liu](https://github.com/redrabbit94)</li> <li>[Rui Xia](https://scholar.google.co.uk/citations?user=26oErxwAAAAJ)</li> <li>[Yuxuan Wang](https://scholar.google.com/citations?user=3RaOfJkAAAAJ)</li> <li>[Mark Plumbley](https://www.surrey.ac.uk/people/mark-plumbley)</li> <li>[Wenwu Wang](http://personal.ee.surrey.ac.uk/Personal/W.Wang/)</li></ul></details> | [![](https://img.shields.io/github/stars/Audio-AGI/AudioSep?style=social)](https://github.com/Audio-AGI/AudioSep) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.05037)</li><li>[project](https://audio-agi.github.io/Separate-Anything-You-Describe/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Audio-AGI/AudioSep/blob/main/AudioSep_Colab.ipynb) | 12.10.2023 |
  • Ziwei Luo - uir?style=social)](https://github.com/Algolzw/daclip-uir) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.01018)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Algolzw/image-restoration-sde)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/weblzw/daclip-uir-ViT-B-32-irsde)</li><li>[project](https://algolzw.github.io/daclip-uir/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/daclip-uir-colab/blob/main/daclip_uir_gradio_colab.ipynb) | 11.10.2023 |
  • Wenxuan Zhang - xjtu.github.io/)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li> <li>[Fei Wang](http://gr.xjtu.edu.cn/zh/web/feynmanw)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836) [![](https://img.shields.io/github/stars/OpenTalker/SadTalker?style=social)](https://github.com/OpenTalker/SadTalker) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.12194)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/rrayYqZ4tf)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhanglonghao1992/One-Shot_Free-View_Neural_Talking_Head_Synthesis), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RenYurui/PIRender), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Deep3DFaceReconstruction), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Zz-ww/SadTalker-Video-Lip-Sync), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/OpenTalker/DPE), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/FeiiYin/SPI), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Mael-zys/T2M-GPT)</li><li>[project](https://sadtalker.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/AoIzJWnQw1M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fDgQcDL-qOc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BkSnM9cxkcM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7u0FYVPQ5rc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/OpenTalker/SadTalker/blob/main/quick_demo.ipynb) | 10.10.2023 |
  • Marco Pasini - cnn), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CPJKU/madmom)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/marcop/musika)</li><li>[project](https://marcoppasini.github.io/musika)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QBl8y2Z_i7Y), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0l7OSM-bFvc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1PowSw3doBURwLE-OTCiWkO8HVbS5paRb) | 09.10.2023 |
  • Kaiheng Weng - object-detection/)</li><li>[data](https://cocodataset.org/#download)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://yolov6-docs.readthedocs.io/zh_CN/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/FeiGeChuanShu/ncnn-android-yolov6), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DefTruth/lite.ai.toolkit/blob/main/lite/ort/cv/yolov6.cpp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Linaom1214/TensorRT-For-YOLO-Series), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhiqwang/yolov5-rt-stack/tree/main/deployment/tensorrt-yolov6)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3OpwcGU7VvE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GJ0lVOE3a7c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3hqkbqJ5ag8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fFCWrMFH2UY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/meituan/YOLOv6/blob/master/turtorial.ipynb) | 08.10.2023 |
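For context on the YOLOv6 entry: detectors are evaluated on COCO with intersection-over-union (IoU), the same overlap metric used inside non-maximum suppression. A self-contained reference implementation for axis-aligned boxes — illustrative, not code from the repository:

```python
def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

COCO's mAP averages precision over IoU thresholds from 0.5 to 0.95.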
  • Jiaxiang Tang - ren.github.io/)</li> <li>[Hang Zhou](https://hangz-nju-cuhk.github.io/)</li> <li>[Ziwei Liu](https://liuziwei7.github.io/)</li> <li>[Gang Zeng](http://www.cis.pku.edu.cn/info/1177/1378.htm)</li></ul> | [![](https://img.shields.io/github/stars/dreamgaussian/dreamgaussian?style=social)](https://github.com/dreamgaussian/dreamgaussian) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.16653)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/graphdeco-inria/diff-gaussian-rasterization), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/nvdiffrast), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hoffstadt/DearPyGui)</li><li>[project](https://dreamgaussian.github.io/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1sLpYmmLS209-e5eHgcuqdryFRRO6ZhFS) | 04.10.2023 |
  • Yuliang Xiu - distance-queries), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Project-Splinter/MonoPortDataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ZhengZerong/PaMIR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Project-Splinter/MonoPort), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/shunsukesaito/SCANimate), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/aistplusplus_api)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Yuliang/ICON)</li><li>[project](https://icon.is.tue.mpg.de/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hZd6AYin2DE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1-AWeWhPvCTBX0KfMtgtMk10uPU05ihoA) | 31.08.2023 |
  • Maxime Oquab - Nouby](https://aelnouby.github.io/)</li> <li>[Mahmoud Assran](http://www.midoassran.ca/)</li> <li>[Nicolas Ballas](https://scholar.google.com/citations?user=euUV4iUAAAAJ)</li> <li>[Wojciech Galuba](https://scholar.google.com/citations?user=jyaTX64AAAAJ)</li> <li>[Russell Howes](http://www.russellhowes.net/)</li> <li>[Po-Yao Huang](https://berniebear.github.io/)</li> <li>[Shang-Wen Li](https://swdanielli.github.io/)</li> <li>[Ishan Misra](http://imisra.github.io/)</li> <li>[Michael Rabbat](https://scholar.google.com/citations?user=cMPKe9UAAAAJ)</li> <li>[Vasu Sharma](https://vasusharma.github.io/)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Hu Xu](https://howardhsu.github.io/)</li> <li>[Hervé Jegou](https://github.com/jegou)</li> <li>[Julien Mairal](http://thoth.inrialpes.fr/people/mairal/)</li> <li>[Patrick Labatut](https://github.com/patricklabatut)</li> <li>[Armand Joulin](https://scholar.google.com/citations?user=kRJkDakAAAAJ)</li> <li>[Piotr Bojanowski](https://github.com/piotr-bojanowski)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/dinov2?style=social)](https://github.com/facebookresearch/dinov2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.07193)</li><li>[blog post](https://ai.facebook.com/blog/dino-v2-computer-vision-self-supervised-learning/)</li><li>[demo](https://dinov2.metademolab.com/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/model_doc/dinov2)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://purnasaigudikandula.medium.com/dinov2-image-classification-visualization-and-paper-review-745bee52c826), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/meta-ais-another-revolutionary-large-scale-model-dinov2-for-image-feature-extraction-1114b287eadd)</li><li>[<img src="images/yt.svg" alt="yt" 
height=20/>](https://youtu.be/csEgtSh7jV4), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/KSZiJ4k28b4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RZEkdOc3szU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/dinov2/blob/main/notebooks/semantic_segmentation.ipynb) | 31.08.2023 |
  • Zhaoshuo Li - Yu Liu](https://mingyuliu.net/)</li> <li>[Chen-Hsuan Lin](https://chenhsuanlin.bitbucket.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/NVlabs/neuralangelo?style=social)](https://github.com/NVlabs/neuralangelo) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.03092)</li><li>[blog post](https://blogs.nvidia.com/blog/2023/06/01/neuralangelo-ai-research-3d-reconstruction/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mli0603/BlenderNeuralangelo)</li><li>[project](https://research.nvidia.com/labs/dir/neuralangelo/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/PQMNCXR-WF8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Qpdw3SW54kI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/lC2uPDfaTcE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/13u8DX9BNzQwiyPPCB7_4DbSxiQ5-_nGF) | 27.08.2023 |
  • Alexander Groshev - kuznetsov-70ab12127)</li> <li>[Denis Dimitrov](https://github.com/denndimitrov)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ACCESS.2022.3196668)](https://doi.org/10.1109/ACCESS.2022.3196668) [![](https://img.shields.io/github/stars/ai-forever/ghost?style=social)](https://github.com/ai-forever/ghost) <ul><li>[blog post](https://habr.com/ru/company/sberbank/blog/645919/)</li><li>[data](https://www.robots.ox.ac.uk/~vgg/data/vgg_face/)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/wawa9000/ghost)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1vXTpsENipTmjTMggwveCkXASwxUk270n) | 22.08.2023 |
  • Matthias Minderer - badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.06230)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/owlvit)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/zeroshot_object_detection_with_owlvit.ipynb) | 21.08.2023 |
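For context on the OWL-ViT entry: open-vocabulary detection scores image regions against free-text prompts in a shared CLIP-style embedding space, and each region takes the label whose text embedding is most similar to its image embedding. A toy sketch of that scoring step with hypothetical pre-computed embeddings — not the actual model:

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def best_label(region_emb, text_embs):
    # Zero-shot labelling: pick the prompt whose text embedding is closest
    # (by cosine similarity) to the region's image embedding.
    return max(text_embs, key=lambda lbl: cosine(region_emb, text_embs[lbl]))
```

Because the label set is just a dict of prompts, new categories can be queried at inference time without retraining.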
  • Prajwal Renukanand - badge.php?doi=10.1145/3394171.3413532)](https://doi.org/10.1145/3394171.3413532) [![](https://img.shields.io/github/stars/Rudrabha/Wav2Lip?style=social)](https://github.com/Rudrabha/Wav2Lip) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2008.10010)</li><li>[data](https://www.robots.ox.ac.uk/~vgg/data/lip_reading/lrs2.html)</li><li>[demo](http://bhaasha.iiit.ac.in/lipsync/)</li><li>[project](http://cvit.iiit.ac.in/research/projects/cvit-projects/a-lip-sync-expert-is-all-you-need-for-speech-to-lip-generation-in-the-wild/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=0fXaDCZNOJc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/eyaler/avatars4all/blob/master/melaflefon.ipynb) | 19.08.2023 |
  • Tero Karras - Aittala)</li> <li>[Samuli Laine](https://research.nvidia.com/person/Samuli-Laine)</li> <li>[Erik Härkönen](https://github.com/harskish)</li><details><summary>others</summary><li>[Janne Hellsten](https://research.nvidia.com/person/Janne-Hellsten)</li> <li>[Jaakko Lehtinen](https://users.aalto.fi/~lehtinj7/)</li> <li>[Timo Aila](https://research.nvidia.com/person/timo-aila)</li></ul></details> | [![](https://img.shields.io/github/stars/NVlabs/stylegan3?style=social)](https://github.com/NVlabs/stylegan3) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.12423), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.08500), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.01401), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.06991), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1812.04948), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1606.03498)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan3-detector), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/metfaces-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2021/hash/076ccd93ad68be51f23707988e934906-Abstract.html)</li><li>[project](https://nvlabs.github.io/stylegan3)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1BXNHZBai-pXtP-ncliouXo_kUiG1Pq7M) | 13.08.2023 |
  • Andrew Brock
  • Yang Zhou - shechtman/)</li> <li>[Jose Echevarria](http://www.jiechevarria.com/)</li><details><summary>others</summary><li>[Evangelos Kalogerakis](https://people.cs.umass.edu/~kalo/)</li> <li>[Dingzeyu Li](https://dingzeyu.li/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3414685.3417774)](https://doi.org/10.1145/3414685.3417774) [![](https://img.shields.io/github/stars/yzhou359/MakeItTalk?style=social)](https://github.com/yzhou359/MakeItTalk) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.12992)</li><li>[data](https://drive.google.com/drive/folders/1EwuAy3j1b9Zc1MsidUfxG_pJGc_cV60O)</li><li>[project](https://people.umass.edu/~yangzhou/MakeItTalk/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=vUMGKASgbf8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/iboyles/makeittalknow/blob/main/working_quick_demo_of_makeittalk_07_2023.ipynb) | 27.07.2023 |
  • Denis Korzhenkov - [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00751)](https://doi.org/10.1109/CVPR42600.2020.00751) [![](https://img.shields.io/github/stars/saic-mdal/HiDT?style=social)](https://github.com/saic-mdal/HiDT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.08791)</li><li>[project](https://saic-mdal.github.io/HiDT/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLuvGzlEQXT1KQuKrfBBEWh2f3PToxyeM5), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=EWKAgwgqXB4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/saic-mdal/hidt/blob/master/notebooks/HighResolutionDaytimeTranslation.ipynb) | 24.07.2023 |
  • Xudong Wang
  • Xinyu Huang - nju)</li><details><summary>others</summary><li>[Yanchun Xie](https://scholar.google.com/citations?user=T0xk9-wAAAAJ)</li> <li>[Yuzhuo Qin](https://scholar.google.com/citations?user=5ZG65AkAAAAJ)</li> <li>[Tong Luo](https://ieeexplore.ieee.org/author/37089387319)</li> <li>[Yaqian Li](https://openreview.net/profile?id=~Yaqian_Li1)</li> <li>[Yandong Guo](http://www.lsl.zone/)</li> <li>[Yandong Guo](https://scholar.google.com/citations?user=fWDoWsQAAAAJ)</li> <li>[Lei Zhang](https://www.leizhang.org/)</li></ul></details> | [![](https://img.shields.io/github/stars/xinyu1205/recognize-anything?style=social)](https://github.com/xinyu1205/recognize-anything) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.03514), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.05657)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/OpenGVLab/Ask-Anything), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/positive666/Prompt-Can-Anything)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://artgor.medium.com/paper-review-recognize-anything-a-strong-image-tagging-model-9e5e1c6dd0af)</li><li>[project](https://recognize-anything.github.io/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mhd-medfa/recognize-anything/blob/main/recognize_anything_demo.ipynb) | 09.07.2023 |
  • Jian Zhao - nb/Thin-Plate-Spline-Motion-Model?style=social)](https://github.com/yoyo-nb/Thin-Plate-Spline-Motion-Model) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.14367)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/monkey-net), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/video-preprocessing), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/pose-evaluation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TalkUHulk/Image-Animation-Turbo-Boost)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/CVPR/Image-Animation-using-Thin-Plate-Spline-Motion-Model)</li><li>[supp](https://cloud.tsinghua.edu.cn/f/f7b8573bb5b04583949f/?dl=1)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1DREfdpnaBhqISg0fuQlAAIwyGVn1loH_) | 07.07.2023 |
  • Xingang Pan - inf.mpg.de/~tleimkue/)</li> <li>[Lingjie Liu](https://lingjie0206.github.io/)</li><details><summary>others</summary><li>[Abhimitra Meka](https://www.meka.page/)</li> <li>[Christian Theobalt](https://people.mpi-inf.mpg.de/~theobalt/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500) [![](https://img.shields.io/github/stars/XingangPan/DragGAN?style=social)](https://github.com/XingangPan/DragGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.10973)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan3#requirements)</li><li>[project](https://vcai.mpi-inf.mpg.de/projects/DragGAN/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/XingangP)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1mey-IXPwQC_qSthI5hO-LTX7QL4ivtPh) | 03.07.2023 |
  • Xu Zhao - ding)</li> <li>[Yongqi An](https://github.com/an-yongqi)</li> <li>[Yinglong Du](https://github.com/YinglongDu)</li><details><summary>others</summary><li>[Tao Yu](https://github.com/tianjinren)</li> <li>[Min Li](https://github.com/limin2021)</li> <li>[Ming Tang](https://www.researchgate.net/profile/Ming-Tang-2)</li> <li>[Jinqiao Wang](https://scholar.google.com/citations?user=7_BkyxEAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/CASIA-IVA-Lab/FastSAM?style=social)](https://github.com/CASIA-IVA-Lab/FastSAM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.12156), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10003)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ChuRuaNh0/FastSam_Awsome_TensorRT)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@mahimairaja/so-what-exactly-is-fastsam-the-ultimate-guide-ddae21d3b486)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yHNPyqazYYU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SslzS0AsiAw), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/qvqkjP1wCDE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1oX14f6IneGGw612WgVlAiy91UHwFAvr9) | 30.06.2023 |
  • Chaoning Zhang - Ho Bae](https://scholar.google.com/citations?user=EULut5oAAAAJ)</li> <li>[Seungkyu Lee](https://scholar.google.com/citations?user=3Pf6C6cAAAAJ)</li> <li>[Choong Seon Hong](https://scholar.google.com/citations?user=oKANWloAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/ChaoningZhang/MobileSAM?style=social)](https://github.com/ChaoningZhang/MobileSAM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.14289)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jolibrain/joliGEN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/akbartus/MobileSAM-in-the-Browser), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/qiaoyu1002/Inpaint-Anything), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/qiaoyu1002/Personalize-SAM), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Jumpat/SegmentAnythingin3D), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vietanhdev/anylabeling), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/wangsssky/SonarSAM), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/continue-revolution/sd-webui-segment-anything)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/_akhaliq/status/1674410573075718145)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/eTEfq_kWabQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ChaoningZhang/MobileSAM/blob/master/notebooks/predictor_example.ipynb) | 30.06.2023 |
  • Shilong Liu - cv.github.io/)</li> <li>[Chunyuan Li](https://scholar.google.com/citations?user=Zd7WmXUAAAAJ)</li> <li>[Jianwei Yang](https://jwyang.github.io/)</li> <li>[Hang Su](https://www.suhangss.me/)</li> <li>[Jun Zhu](https://scholar.google.com/citations?user=axsP38wAAAAJ)</li> <li>[Lei Zhang](https://www.leizhang.org/)</li></ul></details> | [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.05499)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/DINO), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Semantic-SAM), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/OptimalScale/DetGPT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/OpenSeeD), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Segment-Everything-Everywhere-All-At-Once), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/X-Decoder/tree/xgpt), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/detrex)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/zero-shot-object-detection-on-mscoco?p=grounding-dino-marrying-dino-with-grounded), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/zero-shot-object-detection-on-odinw?p=grounding-dino-marrying-dino-with-grounded), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/object-detection-on-coco-minival?p=grounding-dino-marrying-dino-with-grounded), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/object-detection-on-coco?p=grounding-dino-marrying-dino-with-grounded)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/wxWDt5UiwY8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/cMa77r3YrDk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/C4NqaRBz_Kw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oEQYStnF2l8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/zero-shot-object-detection-with-grounding-dino.ipynb) | 28.06.2023 |
  • Adam Roberts - Thorp](https://scholar.google.com/citations?user=qsPv098AAAAJ)</li> <li>[Colin Raffel](https://colinraffel.com/)</li> <li>[Noam Shazeer](https://scholar.google.com/citations?user=wsGvgA8AAAAJ)</li> <li>[Marvin Ritter](https://scholar.google.com/citations?user=arcf5FgAAAAJ)</li> <li>[Maarten Bosma](https://scholar.google.com/citations?user=wkeFQPgAAAAJ)</li> <li>[Alexandre Passos](https://www.ic.unicamp.br/~tachard/)</li> <li>[Jeremy Maitin-Shepard](https://research.google/people/JeremyMaitinShepard/)</li> <li>[Noah Fiedel](https://scholar.google.com/citations?user=XWpV9DsAAAAJ)</li> <li>[Brennan Saeta](https://github.com/saeta)</li> <li>[Ryan Sepassi](https://ryansepassi.com/)</li> <li>[Alexander Spiridonov](https://research.google/people/AlexanderSpiridonov/)</li> <li>[Joshua Newlan](https://github.com/joshnewlan)</li> <li>[Andrea Gesmundo](https://github.com/agesmundo)</li></ul></details> | [![](https://img.shields.io/github/stars/google-research/t5x?style=social)](https://github.com/google-research/t5x) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.17189), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.10683)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://t5x.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tensorflow/mesh), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/tensorflow/serving)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/wmt_t2t_translate), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/data), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/tensorboard)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/t5x/blob/main/t5x/notebooks/introduction.ipynb) | 27.06.2023 |
  • Aliaksandr Siarohin - [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICPR48806.2021.9412520)](https://doi.org/10.1109/ICPR48806.2021.9412520) [![](https://img.shields.io/github/stars/AliaksandrSiarohin/motion-cosegmentation?style=social)](https://github.com/AliaksandrSiarohin/motion-cosegmentation) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](http://arxiv.org/abs/2004.03234)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/video-preprocessing)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=RJ4Nj1wV5iA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/AliaksandrSiarohin/motion-cosegmentation/blob/master/part_swap.ipynb) | 07.04.2020 |
  • Vineel Pratap - kundu)</li> <li>[Ali Elkahky](https://scholar.google.com/citations?user=KB3S8RoAAAAJ)</li> <li>[Zhaoheng Ni](https://scholar.google.com/citations?user=SYFMSNsAAAAJ)</li> <li>[Apoorv Vyas](https://apoorv2904.github.io/)</li> <li>[Maryam Fazel-Zarandi](https://www.maryamfazel.com/)</li> <li>[Alexei Baevski](https://github.com/alexeib)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li> <li>[Xiaohui Zhang](https://github.com/xiaohui-zhang)</li> <li>[Wei-Ning Hsu](https://wnhsu.github.io/)</li> <li>[Alexis Conneau](https://github.com/aconneau)</li> <li>[Michael Auli](https://github.com/michaelauli)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/fairseq?style=social)](https://github.com/facebookresearch/fairseq/tree/main/examples/mms) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.13516)</li><li>[blog post](https://ai.facebook.com/blog/multilingual-model-speech-recognition/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/model_doc/mms), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/mms-cclms/), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/blog/mms_adapters)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GEzxHxWys2s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g06agCmxS7I)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/fairseq/blob/main/examples/mms/asr/tutorial/MMS_ASR_Inference_Colab.ipynb) | 26.05.2023 |
  • chervonij - [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628) [![](https://img.shields.io/github/stars/iperov/DeepFaceLab?style=social)](https://github.com/iperov/DeepFaceLab) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05535)</li><li>[guide](https://mrdeepfakes.com/forums/thread-guide-deepfacelab-google-colab-tutorial)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCTKBl8kB6DJ_qLnk1NGDGbQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/chervonij/DFL-Colab/blob/master/DFL_Colab.ipynb) | 30.04.2023 |
  • Laurence Midgley - fHPgAAAAJ)</li> <li>[José Miguel Hernández-Lobato](https://jmhl.org/)</li></ul> | [![](https://img.shields.io/github/stars/lollcat/fab-torch?style=social)](https://github.com/lollcat/fab-torch) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2208.01893)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lollcat/fab-jax-old), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/annealed_flow_transport)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xQQXvOWu9nE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/lollcat/fab-torch/blob/master/experiments/gmm/fab_gmm.ipynb) | 29.04.2023 |
  • Shangchen Zhou - chongyi.github.io/)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li></ul> | [![](https://img.shields.io/github/stars/sczhou/CodeFormer?style=social)](https://github.com/sczhou/CodeFormer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.11253)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/samb-t/unleashing-transformers), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepcam-cn/yolov5-face), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper_files/paper/2022/hash/c573258c38d0a3919d8c1364053c45df-Abstract-Conference.html)</li><li>[project](https://shangchenzhou.com/projects/CodeFormer/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/d3VDpkXlueI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/PtwWu-FugbA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ORtYP8NW4T0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xc5lKOKBCcg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1m52PNveE4PBhYrecj34cnpEeiHcC5LTb) | 21.04.2023 |
  • Levon Khachatryan - hen)</li><details><summary>others</summary><li>[Zhangyang Wang](https://www.ece.utexas.edu/people/faculty/atlas-wang)</li> <li>[Shant Navasardyan](https://scholar.google.com/citations?user=VJSh59sAAAAJ)</li> <li>[Humphrey Shi](https://www.humphreyshi.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/Picsart-AI-Research/Text2Video-Zero?style=social)](https://github.com/Picsart-AI-Research/Text2Video-Zero) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.13439), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.01341), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.17604)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/dbolya/tomesd), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/JiauZhang/Text2Video-Zero), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/camenduru/text2video-zero-colab), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/SHI-Labs/Text2Video-Zero-sd-webui)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/api/pipelines/text_to_video_zero)</li><li>[project](https://text2video-zero.github.io/)</li><li>[video](https://www.dropbox.com/s/uv90mi2z598olsq/Text2Video-Zero.MP4)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/beeDJJz-Q0A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/97-1GYPtz0M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/text2video-zero-colab/blob/main/text2video_all.ipynb) | 11.04.2023 |
  • Alexander Kirillov - rolland-223135a/)</li> <li>[Laura Gustafson](https://scholar.google.com/citations?user=c8IpF9gAAAAJ)</li> <li>[Tete Xiao](https://tetexiao.com/)</li> <li>[Spencer Whitehead](https://www.spencerwhitehead.com/)</li> <li>[Alex Berg](http://acberg.com/)</li> <li>[Wan-Yen Lo](https://github.com/wanyenlo)</li> <li>[Piotr Dollar](https://pdollar.github.io/)</li> <li>[Ross Girshick](https://www.rossgirshick.info/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/segment-anything?style=social)](https://github.com/facebookresearch/segment-anything) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.02643)</li><li>[blog post](https://ai.facebook.com/research/publications/segment-anything/), [blog post](https://ai.facebook.com/blog/segment-anything-foundation-model-image-segmentation/)</li><li>[data](https://ai.facebook.com/datasets/segment-anything/)</li><li>[website](https://segment-anything.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2O_vecl28OA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fVeW9a6wItM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FjYE0tKWOiY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/segment-anything/blob/main/notebooks/predictor_example.ipynb) | 10.04.2023 |
  • Fangzhou Hong - kyvzTQrBI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/hongfz16/EVA3D/blob/main/notebook/EVA3D_Demo.ipynb) | 06.04.2023 |
  • Jiaxiang Tang - dreamfusion?style=social)](https://github.com/ashawkey/stable-dreamfusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.14988)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ashawkey/torch-ngp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hoffstadt/DearPyGui)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/runwayml/stable-diffusion-v1-5)</li><li>[project](https://dreamfusion3d.github.io/)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/docs/stable/cpp_extension.html#torch.utils.cpp_extension.load)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uM5NPodZZ1U?t=219), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zWD5ZR5GtJM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L3G0dx1Q0R8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dIgDbBTztUM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1MXT3yfOFvO0ooKEfiUUvTKwUkrrlCHpF) | 04.04.2023 |
  • Kunchang Li - X/UniFormer?style=social)](https://github.com/Sense-X/UniFormer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.04676), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.09450), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.10858), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.17239)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/zihangJiang/TokenLabeling), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/deit), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fvcore), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookincubator/submitit), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/SlowFast), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/SwinTransformer/Swin-Transformer-Object-Detection), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/whai362/PVT/tree/v2/segmentation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/HRNet/HRFormer/tree/main/pose)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Sense-X/uniformer_image_demo), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Sense-X/uniformer_video_demo), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Andy1621/uniformer_image_detection), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Andy1621/uniformer_image_segmentation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/open-mmlab/mmsegmentation/blob/master/demo/MMSegmentation_Tutorial.ipynb) | 31.03.2023 |
  • Shunsuke Saito - IvjMAAAAJ)</li> <li>[Hanbyul Joo](https://jhugestar.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00016)](https://doi.org/10.1109/CVPR42600.2020.00016) [![](https://img.shields.io/github/stars/facebookresearch/pifuhd?style=social)](https://github.com/facebookresearch/pifuhd) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.00452)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uEDqCxvF5yc), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=8qnwbbDS8xk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/11z58bl3meSzo6kFqkahMa35G5jmh2Wgt) | 26.03.2023 |
  • Kun Cheng - [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399) [![](https://img.shields.io/github/stars/OpenTalker/video-retalking?style=social)](https://github.com/OpenTalker/video-retalking) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.14758)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/donydchen/ganimation_replicate), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RenYurui/PIRender), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/OpenTalker/StyleHEAT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/FeiiYin/SPI), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Mael-zys/T2M-GPT)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://xthemadgenius.medium.com/making-videos-talk-right-syncing-lips-with-sound-using-videoretalking-611428084bbc)</li><li>[project](https://opentalker.github.io/video-retalking/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/178krha/videoretalking/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/pttsTrQ-fko), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2Lkw8AmmRn0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RJ8YK_K4Ne0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/vinthony/video-retalking/blob/main/quick_demo.ipynb) | 19.03.2023 |
  • Chenfei Wu - yin)</li> <li>[Weizhen Qi](https://github.com/WeizhenQ)</li> <li>[Xiaodong Wang](https://wang-xiaodong1899.github.io/)</li><details><summary>others</summary><li>[Zecheng Tang](https://github.com/CODINNLG)</li> <li>[Nan Duan](https://nanduan.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/visual-chatgpt?style=social)](https://github.com/microsoft/visual-chatgpt) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.04671)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/hwchase17/langchain), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lllyasviel/ControlNet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/timothybrooks/instruct-pix2pix), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/timojl/clipseg)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0UfXlFUwLms), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7YEiEyfPF5U)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/11BtP3h-w0dZjA-X8JsS9_eo8OeGYvxXB) | 15.03.2023 |
  • Jay Zhangjie Wu - F69UAAAAJ)</li> <li>[Mike Zheng Shou](https://sites.google.com/view/showlab)</li></ul></details> | [![](https://img.shields.io/github/stars/showlab/Tune-A-Video?style=social)](https://github.com/showlab/Tune-A-Video) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.11565), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10752)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Tune-A-Video-library), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-diffusion-2-1), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/sd-dreambooth-library)</li><li>[project](https://tuneavideo.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uzF6CTtjn-g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uUlp1_ExsGQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/showlab/Tune-A-Video/blob/main/notebooks/Tune-A-Video.ipynb) | 23.02.2023 |
  • Roman Suvorov - %D1%81%D0%B8%D0%BB%D1%8C%D0%B2%D0%B5%D1%81%D1%82%D1%80%D0%BE%D0%B2-141b99b6/)</li> <li>[Naejin Kong](https://github.com/naejin-kong)</li> <li>[Harshith Goka](https://github.com/h9399-goka)</li> <li>[Kiwoong Park](https://github.com/kyoong-park)</li> <li>[Victor Lempitsky](http://sites.skoltech.ru/compvision/members/vilem/)</li></ul></details> | [![](https://img.shields.io/github/stars/saic-mdal/lama?style=social)](https://github.com/saic-mdal/lama) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.07161)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/andy971022/auto-lama), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/richzhang/PerceptualSimilarity), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Po-Hsun-Su/pytorch-ssim), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mseitzer/pytorch-fid)</li><li>[project](https://saic-mdal.github.io/lama-project/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/saic-mdal/lama/blob/master/colab/LaMa_inpainting.ipynb) | 15.02.2023 |
  • Tao Yang - pytorch)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yangxy/GPEN/blob/main/GPEN.ipynb) | 15.02.2023 |
  • Max Ingham - diffusion?style=social)](https://github.com/alembics/disco-diffusion) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/guided-diffusion)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_DtWfh9oS54), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gWxmtdZL8FE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yVJB6oD0_gM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/alembics/disco-diffusion/blob/main/Disco_Diffusion.ipynb) | 11.02.2023 |
  • Fabian-Robert Stöter - [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.21105/joss.01667)](https://doi.org/10.21105/joss.01667) [![](https://img.shields.io/github/stars/sigsep/open-unmix-pytorch?style=social)](https://github.com/sigsep/open-unmix-pytorch) <ul><li>[data](https://sigsep.github.io/datasets/musdb.html#musdb18-compressed-stems)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/sigsep/norbert)</li><li>[project](https://sigsep.github.io/open-unmix/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/music-source-separation-on-musdb18?p=open-unmix-a-reference-implementation-for)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLhA3b2k8R3t0VpYCpCTU2B1h604rvnV4N)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1mijF0zGWxN-KaxTnd0q6hayAlrID5fEQ) | 09.02.2023 |
  • Jon Gillick - datasets)</li><li>[web app](https://groove-drums.glitch.me/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=x2YLmXzovDo)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/magenta-demos/blob/master/colab-notebooks/GrooVAE.ipynb) | 01.02.2023 |
  • Ian Simon - demos/blob/master/colab-notebooks/Multitrack_MusicVAE.ipynb) | 01.02.2023 |
  • Adam Roberts - vae)</li><li>[project](https://magenta.tensorflow.org/music-vae)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLBUMAYA6kvGU8Cgqh709o5SUvo-zHGTxr)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/magenta/magenta-demos/blob/master/colab-notebooks/MusicVAE.ipynb) | 01.02.2023 |
  • Manuel Romero - badge.php?doi=10.1109/ICCV.2019.00880)](https://doi.org/10.1109/ICCV.2019.00880) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1903.04411)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/reinforcementlearning/comments/b5lpfl/learning_to_paint_with_modelbased_deep/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=YmOgKZ5oipk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mrm8488/shared_colab_notebooks/blob/master/custom_learningtopaint.ipynb) | 01.02.2023 |
  • Chengyi Wang - chen.github.io/)</li> <li>[Yu Wu](https://www.microsoft.com/en-us/research/people/yuwu1/)</li> <li>[Ziqiang Zhang](https://github.com/zz12375)</li><details><summary>others</summary><li>[Long Zhou](https://long-zhou.github.io/)</li> <li>[Shujie Liu](https://www.microsoft.com/en-us/research/people/shujliu/)</li> <li>[Zhuo Chen](https://www.microsoft.com/en-us/research/people/zhuc/)</li> <li>[Yanqing Liu](https://scholar.google.com/citations?user=dIJFz4UAAAAJ)</li> <li>[Huaming Wang](https://scholar.google.com/citations?user=aJDLg5IAAAAJ)</li> <li>[Jinyu Li](https://www.microsoft.com/en-us/research/people/jinyli/)</li> <li>[Lei He](https://scholar.google.com/citations?user=EKl9yY8AAAAJ)</li> <li>[Sheng Zhao](https://scholar.google.com/citations?user=689bIIwAAAAJ)</li> <li>[Furu Wei](https://www.microsoft.com/en-us/research/people/fuwei/)</li></ul></details> | [![](https://img.shields.io/github/stars/enhuiz/vall-e?style=social)](https://github.com/enhuiz/vall-e) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.02111)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/encodec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeed#requirements)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://vidrihmarko.medium.com/mind-blowing-vall-e-neural-codec-language-models-are-zero-shot-text-to-speech-synthesizers-f002560ecd6)</li><li>[project](https://valle-demo.github.io/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/104ixvi/r_neural_codec_language_models_are_zeroshot_text/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/F6HSsVIkqIU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZehhrrQGmt4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-3MPZxRxvV4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ha2WjP7zfno)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1wEze0kQ0gt9B3bQmmbtbSXCoCTpq5vg-) | 18.01.2023 |
  • Thomas Müller - evans)</li> <li>[Christoph Schied](https://research.nvidia.com/person/christoph-schied)</li> <li>[Alexander Keller](https://research.nvidia.com/person/alex-keller)</li></ul> | [![](https://img.shields.io/github/stars/NVlabs/instant-ngp?style=social)](https://github.com/NVlabs/instant-ngp) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.05989)</li><li>[blog post](https://developer.nvidia.com/blog/getting-started-with-nvidia-instant-nerfs/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/tiny-cuda-nn), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDLabMedia/large-lightfields-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nickponline/dd-nerf-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ocornut/imgui), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nothings/stb)</li><li>[project](https://nvlabs.github.io/instant-ngp/)</li><li>[tutorial](https://www.nvidia.com/en-us/on-demand/session/siggraph2022-sigg22-s-16/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/j8tMk-GE8hY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8GbENSmdVeE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/DJ2hcC1orc4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/z3-fjYzd0BA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/NVlabs/instant-ngp/blob/master/notebooks/instant_ngp.ipynb) | 18.01.2023 |
  • Matthew Tancik - Keil](https://people.eecs.berkeley.edu/~sfk/)</li><details><summary>others</summary><li>[Nithin Raghavan](https://cseweb.ucsd.edu//~n2raghavan/)</li> <li>[Utkarsh Singhal](https://scholar.google.com/citations?user=lvA86MYAAAAJ)</li> <li>[Ravi Ramamoorthi](https://cseweb.ucsd.edu//~ravir/)</li> <li>[Jon Barron](https://jonbarron.info/)</li> <li>[Ren Ng](https://www2.eecs.berkeley.edu/Faculty/Homepages/yirenng.html)</li></ul></details> | [![](https://img.shields.io/github/stars/tancik/fourier-feature-networks?style=social)](https://github.com/tancik/fourier-feature-networks) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1806.07572)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2020/hash/55053683268957697aa39fba6f231c68-Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2007/hash/013a006f03dbc5392effeb8f18fda755-Abstract.html)</li><li>[project](https://bmild.github.io/fourfeat/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nVA6K6Sn2S4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tancik/fourier-feature-networks/blob/master/Demo.ipynb) | 17.01.2023 |
  • Hao-Shu Fang - hy)</li> <li>[Chao Xu](https://www.isdas.cn/)</li><details><summary>others</summary><li>[Haoyi Zhu](https://www.haoyizhu.site/)</li> <li>[Yuliang Xiu](https://xiuyuliang.cn/)</li> <li>[Yong-Lu Li](https://dirtyharrylyl.github.io/)</li> <li>[Cewu Lu](https://scholar.google.com/citations?user=QZVQEWAAAAAJ)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2022.3222784)](https://doi.org/10.1109/TPAMI.2022.3222784) [![](https://img.shields.io/github/stars/MVIG-SJTU/AlphaPose?style=social)](https://github.com/MVIG-SJTU/AlphaPose) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.03375)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tycoer/AlphaPose_jittor), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Fang-Haoshu/Halpe-FullBody)</li><li>[project](https://www.mvig.org/research/alphapose.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uze6chg-YeU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Z2WPd59pRi8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qW4lb9tnA3I), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_qtNzylm1XI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1_3Wxi4H3QGVC28snL3rHIoeMAwI2otMR) | 07.01.2023 |
  • Jiefeng Li - sjtu/HybrIK?style=social)](https://github.com/Jeff-sjtu/HybrIK) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.14672)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mks0601/3DMPPE_POSENET_RELEASE)</li><li>[project](https://jeffli.site/HybrIK/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/3d-human-pose-estimation-on-3dpw?p=hybrik-a-hybrid-analytical-neural-inverse)</li><li>[supp](https://openaccess.thecvf.com/content/CVPR2021/supplemental/Li_HybrIK_A_Hybrid_CVPR_2021_supplemental.zip)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tvwnXXH7xIw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1n41l7I2NxWseuruVQEU8he2XqzSXhu2f) | 01.01.2023 |
  • Alexandre Défossez - Gui), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/kuielab/mdx-net-submission), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/f90/Wave-U-Net)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dC9nVxk3V_VPjUADsnFu8EiT-xnU1tGH) | 21.11.2022 |
  • Mingyuan Zhang - zhang/MotionDiffuse?style=social)](https://github.com/mingyuan-zhang/MotionDiffuse) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2208.15001)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/mingyuan/MotionDiffuse)</li><li>[project](https://mingyuan-zhang.github.io/projects/MotionDiffuse.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U5PTnw490SA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Dp6VsZp2ozKuu9ccMmsDjyij_vXfCYb3) | 13.10.2022 |
  • Shuai Yang - jiang.com/)</li> <li>[Ziwei Liu](https://liuziwei7.github.io/)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550454.3555437)](https://doi.org/10.1145/3550454.3555437) [![](https://img.shields.io/github/stars/williamyang1991/VToonify?style=social)](https://github.com/williamyang1991/VToonify) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.11224), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2001.02890)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zllrunning/face-parsing.PyTorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhujiapeng/LowRankGAN)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/PKUWilliamYang/VToonify), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/PKUWilliamYang/VToonify/tree/main/models)</li><li>[project](https://www.mmlab-ntu.com/project/vtoonify/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0_OmVhDgYuY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/williamyang1991/VToonify/blob/master/notebooks/inference_playground.ipynb) | 07.10.2022 |
  • Hongwen Zhang - DensePose2SMPL), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/DensePose), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Microsoft/human-pose-estimation.pytorch)</li><li>[project](https://www.liuyebin.com/pymaf-x/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yqEmznSKjYI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ylOB0wCeV34)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/11RXLsH9BdoSCwY6G-IX7KgqDxVoImu6K) | 06.10.2022 |
  • Alhussein Fawzi - Paredes](https://sites.google.com/site/romeraparedes/)</li> <li>[Mohammadamin Barekatain](http://barekatain.me/)</li> <li>[Alexander Novikov](https://scholar.google.com/citations?user=jMUkLqwAAAAJ)</li> <li>[Francisco Ruiz](https://franrruiz.github.io/)</li> <li>[Julian Schrittwieser](https://www.furidamu.org/)</li> <li>[Grzegorz Swirszcz](https://sites.google.com/site/grzegorzswirszcz/home)</li> <li>[David Silver](https://www.davidsilver.uk/)</li> <li>[Demis Hassabis](https://en.wikipedia.org/wiki/Demis_Hassabis)</li> <li>[Pushmeet Kohli](https://sites.google.com/site/pushmeet/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4) [![](https://img.shields.io/github/stars/deepmind/alphatensor?style=social)](https://github.com/deepmind/alphatensor) <ul><li>[blog post](https://www.deepmind.com/blog/discovering-novel-algorithms-with-alphatensor)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3N3Bl5AA5QU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gpYnDls4PdQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/IYgZS2EvnLI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8ILk4Wjo5rc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/alphatensor/blob/master/nonequivalence/inspect_factorizations_notebook.ipynb) | 04.10.2022 |
  • Marcos Conde - Jin Choi](https://github.com/Choiuijin1125)</li> <li>[Maxime Burchi](https://scholar.google.com/citations?user=7S_l2eAAAAAJ)</li> <li>[Radu Timofte](https://www.informatik.uni-wuerzburg.de/computervision/home/)</li></ul> | [![](https://img.shields.io/github/stars/mv-lab/swin2sr?style=social)](https://github.com/mv-lab/swin2sr) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.11345), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.10257), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2208.11184), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.09883)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cszn/KAIR/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mv-lab/AISP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Swin-Transformer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/jjourney1125/swin2sr)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/jesucristo/super-resolution-demo-swin2sr-official/), [<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/datasets/jesucristo/super-resolution-benchmarks), [<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/jinssaa/official-swin2sr-demo-results/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1paPrt62ydwLv2U2eZqfcFsePI4X4WRR1) | 03.10.2022 |
  • Emilien Dupont - nerf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/jaxline)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/celeb_a_hq)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/functa/blob/main/modulation_visualization_colab.ipynb) | 24.09.2022 |
  • Alec Radford - python)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/OCBZtgQGt1I), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8SQV-B83tPU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nE5iVtwKerA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openai/whisper/blob/master/notebooks/LibriSpeech.ipynb) | 21.09.2022 |
  • Jason Antic - ai-research-lab/stabilizing-neural-style-transfer-for-video-62675e203e42)</li><li>[model](https://data.deepai.org/deoldify/ColorizeVideo_gen.pth)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/Nickelodeons/), [<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/silentmoviegifs/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/DeOldify)</li><li>[website](https://deoldify.ai/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=l3UXXid04Ys), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=EXn-n2iqEjI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jantic/DeOldify/blob/master/VideoColorizerColab.ipynb) | 19.09.2022 |
  • Jason Antic - robinson)</li> <li>[María Benavente](https://github.com/mariabg)</li></ul> | [![](https://img.shields.io/github/stars/jantic/DeOldify?style=social)](https://github.com/jantic/DeOldify) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1805.08318), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.08500)</li><li>[model](https://data.deepai.org/deoldify/ColorizeArtistic_gen.pth)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/TheWayWeWere/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/DeOldify)</li><li>[website](https://deoldify.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jantic/DeOldify/blob/master/ImageColorizerColab.ipynb) | 19.09.2022 |
  • Xintao Wang - ESRGAN?style=social)](https://github.com/xinntao/Real-ESRGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.10833)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/ESRGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyView), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Tencent/ncnn), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nihui/waifu2x-ncnn-vulkan)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1k2Zod6kSHEvraybHl50Lys0LerhyTMCo) | 18.09.2022 |
  • Jingxiang Sun - 3D?style=social)](https://github.com/MrTornado24/IDE-3D) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.15517)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/eg3d), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan3)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Kj5XY_J2Alk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/MrTornado24/IDE-3D/blob/main/inversion/notebooks/inference_playground.ipynb) | 08.09.2022 |
  • Lili Chen - grover.github.io/)</li> <li>[Michael Laskin](https://www.mishalaskin.com/)</li> <li>[Pieter Abbeel](http://people.eecs.berkeley.edu/~pabbeel/)</li> <li>[Aravind Srinivas](https://github.com/aravindsrinivas)</li> <li>[Igor Mordatch](https://scholar.google.com/citations?user=Vzr1RukAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/kzl/decision-transformer?style=social)](https://github.com/kzl/decision-transformer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.01345)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/models?other=gym-continous-control), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/edbeeching/decision-transformer-gym-hopper-expert), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/decision_transformer)</li><li>[project](https://sites.google.com/berkeley.edu/decision-transformer)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Autoregressive_model)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/k08N5a0gG0A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-buULmf7dec), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/83QN9S-0I84), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w4Bw8WYL8Ps)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1K3UuajwoPY1MzRKNkONNRS3gS5DxZ-qF) | 06.09.2022 |
  • Ajay Jain - badge.php?doi=10.1109/CVPR52688.2022.00094)](https://doi.org/10.1109/CVPR52688.2022.00094) [![](https://img.shields.io/github/stars/google-research/google-research?style=social)](https://github.com/google-research/google-research/tree/master/dreamfields) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.01455), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.00677), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.13415)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ajayjain/DietNeRF), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/mipnerf)</li><li>[project](https://ajayj.com/dreamfields)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1Fke6w46tv4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1TjCWS2_Q0HJKdi9wA2OSY7avmFUQYGje) | 05.09.2022 |
  • William Peebles - Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li> <li>[Richard Zhang](https://richzhang.github.io/)</li> <li>[Antonio Torralba](https://groups.csail.mit.edu/vision/torralbalab/)</li><details><summary>others</summary><li>[Alexei Efros](https://people.eecs.berkeley.edu/~efros/)</li> <li>[Eli Shechtman](https://research.adobe.com/person/eli-shechtman/)</li></ul></details> | [![](https://img.shields.io/github/stars/wpeebles/gangealing?style=social)](https://github.com/wpeebles/gangealing) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.05143)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/nileshkulkarni/acsm), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://jitengmu.github.io/CoordGAN/)</li><li>[project](https://www.wpeebles.com/gangealing)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Qa1ASS_NuzE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qtOkktTNs-k)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JkUjhTjR8MyLxwarJjqnh836BICfocTu) | 01.09.2022 |
  • Rinon Gal - alaluf.github.io/)</li> <li>[Yuval Atzmon](https://research.nvidia.com/person/yuval-atzmon)</li> <li>[Or Patashnik](https://orpatashnik.github.io/)</li><details><summary>others</summary><li>[Amit Bermano](https://www.cs.tau.ac.il/~amberman/)</li> <li>[Gal Chechik](https://research.nvidia.com/person/gal-chechik)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/rinongal/textual_inversion?style=social)](https://github.com/rinongal/textual_inversion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2208.01618)</li><li>[project](https://textual-inversion.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/f3oXa7_SYek), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/opD_H9bED9Y)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/rinongal/textual_inversion/blob/master/scripts/latent_imagenet_diffusion.ipynb) | 21.08.2022 |
  • Jianglin Fu - Yee Lin](https://kwanyeelin.github.io/)</li><details><summary>others</summary><li>[Chen Qian](https://scholar.google.com/citations?user=AerkT0YAAAAJ)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li> <li>[Wayne Wu](https://wywu.github.io/)</li> <li>[Ziwei Liu](https://liuziwei7.github.io/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-19787-1_1)](https://doi.org/10.1007/978-3-031-19787-1_1) [![](https://img.shields.io/github/stars/stylegan-human/stylegan-human?style=social)](https://github.com/stylegan-human/stylegan-human) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.11823)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan3)</li><li>[project](https://stylegan-human.github.io/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/market-1501)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nIrb9hwsdcI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/86b49sCz0Gg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g3nmM6MdxwY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/p2uwqh_SFL8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1sgxoDM55iM07FS54vz9ALg1XckiYA2On) | 19.08.2022 |
  • Oran Gafni - A-Scene?style=social)](https://github.com/CasualGANPapers/Make-A-Scene) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13131)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZM06MjPdoxw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1SPyQ-epTsAOAu8BEohUokN4-b5RM_TnE) | 12.08.2022 |
  • Rinon Gal - chechik)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530164)](https://doi.org/10.1145/3528223.3530164) [![](https://img.shields.io/github/stars/rinongal/StyleGAN-nada?style=social)](https://github.com/rinongal/StyleGAN-nada) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.00946), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.17249), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.02699)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada)</li><li>[project](https://stylegan-nada.github.io/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/rinongal/stylegan-nada/blob/main/stylegan_nada.ipynb) | 09.08.2022 |
  • Chien-Yao Wang - segments.zip)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/WongKinYiu/yolor), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/WongKinYiu/PyTorch_YOLOv4), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/WongKinYiu/ScaledYOLOv4), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Megvii-BaseDetection/YOLOX), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DingXiaoH/RepVGG), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/JUGGHM/OREPA_CVPR2022), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/real-time-object-detection-on-coco?p=yolov7-trainable-bag-of-freebies-sets-new)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL_Nji0JOuXg2QMohGK7wfzgJ-MavzXRHW), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-QWxJ0j9EY8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/WongKinYiu/yolov7/blob/main/tools/compare_YOLOv7_vs_YOLOv5m6_half.ipynb) | 09.08.2022 |
  • Liunian Harold Li - zhang.github.io/)</li> <li>[Jianwei Yang](https://jwyang.github.io/)</li><details><summary>others</summary><li>[Chunyuan Li](https://chunyuan.li/)</li> <li>[Yiwu Zhong](https://pages.cs.wisc.edu/~yiwuzhong/)</li> <li>[Lijuan Wang](https://github.com/LijuanWang)</li> <li>[Lu Yuan](https://scholar.google.com/citations?user=k9TsUVsAAAAJ)</li> <li>[Lei Zhang](https://www.leizhang.org/)</li> <li>[Jenq-Neng Hwang](https://people.ece.uw.edu/hwang/)</li> <li>[Kai-Wei Chang](http://web.cs.ucla.edu/~kwchang/)</li> <li>[Jianfeng Gao](https://www.microsoft.com/en-us/research/people/jfgao/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069) [![](https://img.shields.io/github/stars/microsoft/GLIP?style=social)](https://github.com/microsoft/GLIP) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.03857), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.05836), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.01066), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.08790)</li><li>[blog post](https://www.microsoft.com/en-us/research/project/project-florence-vl/articles/object-detection-in-the-wild-via-grounded-language-image-pre-training/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/gligen/GLIGEN)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/harold/GLIP)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://sh-tsang.medium.com/glip-grounded-language-image-pre-training-2be2483295b3), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/glip-introducing-language-image-pre-training-to-object-detection-5ddb601873aa)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zu1BGQBI4dU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12x7v-_miN7-SRiziK3Cx4ffJzstBJNqb) | 30.07.2022 |
  • Ji Lin - Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01474)](https://doi.org/10.1109/CVPR46437.2021.01474) [![](https://img.shields.io/github/stars/mit-han-lab/anycost-gan?style=social)](https://github.com/mit-han-lab/anycost-gan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.03243)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/switchablenorms/CelebAMask-HQ), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/fyu/lsun)</li><li>[project](https://hanlab.mit.edu/projects/anycost-gan/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=_yEziPl9AkM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mit-han-lab/anycost-gan/blob/master/notebooks/intro_colab.ipynb) | 20.07.2022 |
  • Xintao Wang - li.github.io/)</li> <li>[Honglun Zhang](https://scholar.google.com/citations?user=KjQLROoAAAAJ)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/TencentARC/GFPGAN?style=social)](https://github.com/TencentARC/GFPGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2101.04061)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyView), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset)</li><li>[project](https://xinntao.github.io/projects/gfpgan)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1sVsoBd9AjckIXThgtZhGrHRfFI6UUYOo) | 13.07.2022 |
  • Hansheng Chen - QIwAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/tjiiv-cprg/EPro-PnP?style=social)](https://github.com/tjiiv-cprg/EPro-PnP) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13254)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/megvii-research/petr), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/HuangJunJie2017/BEVDet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/fudan-zvg/PolarFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhiqi-li/BEVFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmdetection3d)</li><li>[nuScenes](https://www.nuscenes.org/object-detection?externalData=no&mapData=no&modalities=Camera)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TonBodQ6EUU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tjiiv-cprg/EPro-PnP/blob/main/demo/fit_identity.ipynb) | 12.07.2022 |
  • Shuyang Gu - us/research/people/fangwen/)</li><details><summary>others</summary><li>[Bo Zhang](https://bo-zhang.me/)</li> <li>[Dongdong Chen](http://www.dongdongchen.bid/)</li> <li>[Lu Yuan](https://scholar.google.com/citations?&user=k9TsUVsAAAAJ)</li> <li>[Baining Guo](https://scholar.google.com/citations?user=h4kYmRYAAAAJ)</li> <li>[Shuyang Gu](https://github.com/cientgu)</li> <li>[Zhicong Tang](https://github.com/zzctan)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/VQ-Diffusion?style=social)](https://github.com/microsoft/VQ-Diffusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.14822), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.16007)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ehoogeboom/multinomial_diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/improved-diffusion)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Ws0_wK2cnsWEnfB7HtmPT4bjCPElb40C) | 30.06.2022 |
  • Susan Zhang - y6SIhQAAAAJ)</li> <li>[Xi Victoria Lin](http://victorialin.net/)</li> <li>[Todor Mihaylov](https://github.com/tbmihailov)</li> <li>[Myle Ott](https://myleott.com/)</li> <li>[Sam Shleifer](https://github.com/sshleifer)</li> <li>[Kurt Shuster](https://github.com/klshuster)</li> <li>[Daniel Simig](https://scholar.google.com/citations?user=TtWU9fsAAAAJ)</li> <li>[Punit Singh Koura](https://github.com/punitkoura)</li> <li>[Anjali Sridhar](https://www.linkedin.com/in/anjalisridhar/)</li> <li>[Tianlu Wang](https://tianlu-wang.github.io/)</li> <li>[Luke Zettlemoyer](https://www.cs.washington.edu/people/faculty/lsz/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/metaseq?style=social)](https://github.com/facebookresearch/metaseq/tree/main/projects/OPT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.01068), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1906.02243), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.10350), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.11990)</li><li>[blog post](https://ai.facebook.com/blog/democratizing-access-to-large-scale-language-models-with-opt-175b/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/Megatron-LM)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ejg0OunCi9U)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/14wnxMvD9zsiBQo2FtTpxn6w2cpXCcb-7) | 29.06.2022 |
  • Chen Chen
  • Adam Botach - Swin-Transformer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/MTTR/MTTR-Referring-Video-Object-Segmentation)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YqlhXgq6hcs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12p0jpSx3pJNfZk-y_L44yeHZlhsKVra-) | 20.06.2022 |
  • Jingyun Liang - wuerzburg.de/computervision/home/)</li></ul></details> | [![](https://img.shields.io/github/stars/JingyunLiang/SwinIR?style=social)](https://github.com/JingyunLiang/SwinIR) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.10257), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.10833)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cszn/BSRGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Swin-Transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/cszn/KAIR)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/gist/JingyunLiang/a5e3e54bc9ef8d7bf594f6fee8208533/swinir-demo-on-real-world-image-sr.ipynb) | 17.06.2022 |
  • Jingyun Liang - wuerzburg.de/computervision/home/)</li> <li>[Luc Van Gool](https://scholar.google.com/citations?user=TwMib_QAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/JingyunLiang/VRT?style=social)](https://github.com/JingyunLiang/VRT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.12288)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cszn/KAIR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/SwinTransformer/Video-Swin-Transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmediting)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/gist/JingyunLiang/deb335792768ad9eb73854a8efca4fe0/vrt-demo-on-video-restoration.ipynb) | 15.06.2022 |
  • Rohit Girdhar - joulin/)</li> <li>[Ishan Misra](https://imisra.github.io/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01563)](https://doi.org/10.1109/CVPR52688.2022.01563) [![](https://img.shields.io/github/stars/facebookresearch/omnivore?style=social)](https://github.com/facebookresearch/omnivore) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.08377), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.08356)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/omnivore)</li><li>[project](https://facebookresearch.github.io/omnivore/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/epic-kitchens-100)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/omnivore/blob/main/inference_tutorial.ipynb) | 14.06.2022 |
  • Xingyi Zhou - joulin/)</li> <li>[Philipp Krähenbühl](https://github.com/philkr)</li> <li>[Ishan Misra](https://imisra.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21) [![](https://img.shields.io/github/stars/facebookresearch/Detic?style=social)](https://github.com/facebookresearch/Detic) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.02605)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lvis-dataset/lvis-api)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1QtTW9-ukX2HKZGvt0QvVGqjuqEykoZKI) | 07.06.2022 |
  • Victor Sanh - Jian Jiang](https://github.com/tianjianjiang)</li> <li>[Matteo Manica](https://github.com/drugilsberg)</li> <li>[Sheng Shen](https://sincerass.github.io/)</li> <li>[Zheng Xin Yong](https://yongzx.github.io/)</li> <li>[Harshit Pandey](https://scholar.google.com/citations?user=BPIs78gAAAAJ)</li> <li>[Rachel Bawden](https://rbawden.github.io/)</li> <li>[Trishala Neeraj](https://github.com/trishalaneeraj)</li> <li>[Jos Rozen](https://scholar.google.com/citations?user=OxEDKogAAAAJ)</li> <li>[Abheesht Sharma](https://github.com/abheesht-sharma)</li> <li>[Andrea Santilli](https://teelinsan.github.io/)</li> <li>[Thibault Fevry](http://thibaultfevry.com/)</li> <li>[Jason Alan Fries](https://web.stanford.edu/~jfries/)</li> <li>[Ryan Teehan](https://github.com/rteehas)</li> <li>[Stella Biderman](https://www.stellabiderman.com/)</li> <li>[Leo Gao](https://github.com/leogao2)</li> <li>[Tali Bers](https://github.com/tbers-coursera)</li> <li>[Thomas Wolf](https://thomwolf.io/)</li> <li>[Alexander M. Rush](https://scholar.google.com/citations?user=LIjnUGgAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/bigscience-workshop/promptsource?style=social)](https://github.com/bigscience-workshop/promptsource) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.08207)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/iJ0IVZgGjTM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YToXXfrIu6w)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1xx7SgdLaAu23YFBirXmaQViDr8caowX_) | 29.05.2022 |
  • Fangzhou Hong - badge.php?doi=10.1145/3528223.3530094)](https://doi.org/10.1145/3528223.3530094) [![](https://img.shields.io/github/stars/hongfz16/AvatarCLIP?style=social)](https://github.com/hongfz16/AvatarCLIP) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.08535), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.01455), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.03221), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.05139), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13333)</li><li>[data](https://www.di.ens.fr/willow/research/surreal/data/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/daniilidis-group/neural_renderer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/GuyTevet/MotionCLIP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Totoro97/NeuS), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vchoutas/smplx), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nghorbani/human_body_prior)</li><li>[project](https://hongfz16.github.io/projects/AvatarCLIP.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-l2ZMeoASGY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dfaecX7xF3nP6fyXc8XBljV5QY1lc1TR) | 15.05.2022 |
  • Oscar Michel - On](https://github.com/roibaron)</li> <li>[Richard Liu](https://github.com/factoryofthesun)</li> <li>[Sagie Benaim](https://sagiebenaim.github.io/)</li> <li>[Rana Hanocka](http://people.cs.uchicago.edu/~ranahanocka/)</li></ul> | [![](https://img.shields.io/github/stars/threedle/text2mesh?style=social)](https://github.com/threedle/text2mesh) <ul><li>[CLIP](https://openai.com/blog/clip/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.03221)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/neverix/text2mesh/notebook)</li><li>[project](https://threedle.github.io/text2mesh/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/threedle/text2mesh/blob/master/colab_demo.ipynb) | 14.05.2022 |
  • Colin Raffel - research/text-to-text-transfer-transformer?style=social)](https://github.com/google-research/text-to-text-transfer-transformer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.10683)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tensorflow/mesh/tree/master/mesh_tensorflow/transformer)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/main/notebooks/t5-trivia.ipynb) | 11.05.2022 |
  • Arun Babu - 4IAAAAJ)</li> <li>[Alexei Baevski](https://github.com/alexeib)</li> <li>[Alexis Conneau](https://github.com/aconneau)</li> <li>[Michael Auli](https://github.com/michaelauli)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/fairseq?style=social)](https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/xlsr/README.md) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.09296)</li><li>[blog post](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairscale)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLS_R_on_Common_Voice.ipynb) | 10.05.2022 |
  • Yung-Sung Chuang - ms.mit.edu/rumen.html)</li> <li>[Hongyin Luo](https://luohongyin.github.io/)</li> <li>[Yang Zhang](https://mitibmwatsonailab.mit.edu/people/yang-zhang/)</li><details><summary>others</summary><li>[Shiyu Chang](https://code-terminator.github.io/)</li> <li>[Marin Soljačić](http://www.mit.edu/~soljacic/marin.html)</li> <li>[Shang-Wen Li](https://swdanielli.github.io/)</li> <li>[Scott Wen-tau Yih](https://scottyih.org/)</li> <li>[Yoon Kim](https://people.csail.mit.edu/yoonkim/)</li> <li>[James Glass](http://groups.csail.mit.edu/sls/people/glass.shtml)</li></ul></details> | [![](https://img.shields.io/github/stars/voidism/diffcse?style=social)](https://github.com/voidism/diffcse) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.10298), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.08821), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.00899)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/princeton-nlp/SimCSE)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/voidism)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/YungSungChuang/status/1517518077902000129)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/voidism/DiffCSE/blob/master/diffcse_evaluation.ipynb) | 24.04.2022 |
  • Hwanjun Song - heo/home)</li> <li>[Wonjae Kim](https://wonjae.kim/)</li> <li>[Ming-Hsuan Yang](http://faculty.ucmerced.edu/mhyang/)</li></ul></details> | [![](https://img.shields.io/github/stars/naver-ai/vidt?style=social)](https://github.com/naver-ai/vidt/tree/vidt-plus) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.07962), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.03921)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/fundamentalvision/Deformable-DETR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/EherSenaw/ViDT_colab)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/EherSenaw/ViDT_colab/blob/main/vidt_colab.ipynb) | 20.04.2022 |
  • Liangyu Chen - cfoAAAAJ)</li> <li>[Jian Sun](http://www.jiansun.org/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2) [![](https://img.shields.io/github/stars/megvii-research/NAFNet?style=social)](https://github.com/megvii-research/NAFNet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.04676), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.08714)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/image-deblurring-on-gopro?p=simple-baselines-for-image-restoration), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/image-denoising-on-sidd?p=simple-baselines-for-image-restoration)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dkO5AyktmBoWwxBwoKFUurIDn0m4qDXT) | 15.04.2022 |
  • Yinhuai Wang - hu/)</li> <li>[Jian Zhang](http://jianzhang.tech/)</li></ul> | [![](https://img.shields.io/github/stars/jianzhangcs/panini?style=social)](https://github.com/jianzhangcs/panini) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.08444)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/tkarras/progressive_growing_of_gans)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/GeeveGeorge/Panini-Net-Colab/blob/main/PaniniNet_Working.ipynb) | 13.04.2022 |
  • Fujun Luan - shechtman/)</li> <li>[Kavita Bala](https://www.cs.cornell.edu/~kb/)</li></ul> | [![](https://img.shields.io/github/stars/luanfujun/deep-painterly-harmonization?style=social)](https://github.com/luanfujun/deep-painterly-harmonization) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1804.03189), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1701.08893)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jcjohnson/neural-style), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/torch/torch7), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/szagoruyko/loadcaffe)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/gist/eyaler/5303782669fb43510d398bd346c6e3e6/deep-painterly-harmonization.ipynb) | 07.04.2022 |
  • Zhen Li - Ze Lu](https://github.com/LGYoung)</li> <li>[Jianhua Qin](https://scholar.google.com/citations?&user=TAr7TU4AAAAJ)</li> <li>[Chun-Le Guo](https://scholar.google.com/citations?user=RZLYwR0AAAAJ)</li> <li>[Ming-Ming Cheng](https://mmcheng.net/)</li></ul> | [![](https://img.shields.io/github/stars/MCG-NKU/E2FGVI?style=social)](https://github.com/MCG-NKU/E2FGVI) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.02663)</li><li>[data](https://competitions.codalab.org/competitions/19544#participate-get-data), [data](https://data.vision.ee.ethz.ch/csergi/share/davis/DAVIS-2017-trainval-480p.zip)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/researchmm/STTN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Focal-Transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ruiliu-ai/FuseFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/phoenix104104/fast_blind_video_consistency#evaluation)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/mlearning-ai/end-to-end-framework-for-flow-guided-video-inpainting-c5e2d8b61d20)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/N--qC3T2wc4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3eH3Fm6gOFk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12rwY2gtG8jVWlNx9pjmmM8uGmh5ue18G) | 06.04.2022 |
  • Robin Rombach - qp)</li> <li>[Patrick Esser](https://github.com/pesser)</li> <li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042) [![](https://img.shields.io/github/stars/CompVis/latent-diffusion?style=social)](https://github.com/CompVis/latent-diffusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10752), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.09778), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.02114)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/fyu/lsun), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/guided-diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/denoising-diffusion-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/x-transformers)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/multimodalart/latentdiffusion)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/latent-diffusion/blob/master/scripts/latent_imagenet_diffusion.ipynb) | 04.04.2022 |
  • Shuai Yang - jiang.com/)</li> <li>[Ziwei Liu](https://liuziwei7.github.io/)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li></ul> | [![](https://img.shields.io/github/stars/williamyang1991/GP-UNIT?style=social)](https://github.com/williamyang1991/GP-UNIT) <ul><li>[ImageNet](https://image-net.org/download.php)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.03641)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/clovaai/stargan-v2#datasets-and-pre-trained-networks), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/switchablenorms/CelebAMask-HQ), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/metfaces-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TreB1eN/InsightFace_Pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/SPADE), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nvlabs/imaginaire), [<img src="images/git.svg" alt="git" height=20/>](https://doi.org/10.1109/CVPR52688.2022.01779)</li><li>[project](https://www.mmlab-ntu.com/project/gpunit/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dDApWs_oDrM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/williamyang1991/GP-UNIT/blob/main/notebooks/inference_playground.ipynb) | 02.04.2022 |
  • Shuai Yang - jiang.com/)</li> <li>[Ziwei Liu](https://liuziwei7.github.io/)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00754)](https://doi.org/10.1109/CVPR52688.2022.00754) [![](https://img.shields.io/github/stars/williamyang1991/DualStyleGAN?style=social)](https://github.com/williamyang1991/DualStyleGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13248)</li><li>[data](https://cs.nju.edu.cn/rl/WebCaricature.htm), [data](https://www.gwern.net/Crops#danbooru2019-portraits)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lowfuel/progrock-stable), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TreB1eN/InsightFace_Pytorch)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Gradio-Blocks/DualStyleGAN), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/hysts/DualStyleGAN)</li><li>[project](https://www.mmlab-ntu.com/project/dualstylegan/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/scZTu77jixI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/williamyang1991/DualStyleGAN/blob/master/notebooks/inference_playground.ipynb) | 24.03.2022 |
  • Yael Vinker - bo.github.io/)</li> <li>[Roman Bachmann](https://roman-bachmann.github.io/)</li><details><summary>others</summary><li>[Amit Bermano](https://www.cs.tau.ac.il/~amberman/)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li> <li>[Amir Zamir](https://vilab.epfl.ch/zamir/)</li> <li>[Ariel Shamir](https://faculty.runi.ac.il/arik/site/index.asp)</li></ul></details> | [![](https://img.shields.io/github/stars/yael-vinker/CLIPasso?style=social)](https://github.com/yael-vinker/CLIPasso) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.05822), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.14843)</li><li>[demo](https://replicate.com/yael-vinker/clipasso)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/BachiLi/diffvg)</li><li>[project](https://clipasso.github.io/clipasso/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yael-vinker/CLIPasso/blob/main/CLIPasso.ipynb) | 21.03.2022 |
  • Roy Or-El - shechtman/)</li><details><summary>others</summary><li>[Jeong Joon Park](https://jjparkcv.github.io/)</li> <li>[Ira Kemelmacher-Shlizerman](https://www.irakemelmacher.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/royorel/StyleSDF?style=social)](https://github.com/royorel/StyleSDF) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.11427)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/yenchenlin/nerf-pytorch)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/SerdarHelli/StyleSDF-3D)</li><li>[project](https://stylesdf.github.io/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/royorel/StyleSDF/blob/main/StyleSDF_demo.ipynb) | 05.03.2022 |
  • Sen He - hannover.de/en/staff/liao/)</li> <li>[Michael Yang](https://sites.google.com/site/michaelyingyang/)</li> <li>[Yi-Zhe Song](http://personal.ee.surrey.ac.uk/Personal/Y.Song/)</li><details><summary>others</summary><li>[Bodo Rosenhahn](https://scholar.google.com/citations?user=qq3TxtcAAAAJ)</li> <li>[Tao Xiang](http://personal.ee.surrey.ac.uk/Personal/T.Xiang/index.html)</li></ul></details> | [![](https://img.shields.io/github/stars/SenHe/DLFS?style=social)](https://github.com/SenHe/DLFS) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.02874)</li><li>[project](https://senhe.github.io/projects/iccv_2021_lifespan_face/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=uklX03ns0m0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1fgVAoxCSaqPkj0rUK4RmBh7GTQRqLNpE) | 22.02.2022 |
  • Bowen Cheng - schwing.de/)</li> <li>[Alexander Kirillov](https://alexander-kirillov.github.io/)</li> <li>[Rohit Girdhar](https://rohitgirdhar.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135) [![](https://img.shields.io/github/stars/facebookresearch/Mask2Former?style=social)](https://github.com/facebookresearch/Mask2Former) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.01527), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10764)</li><li>[demo](https://replicate.com/facebookresearch/mask2former)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/MaskFormer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/Mask2Former)</li><li>[project](https://bowenc0221.github.io/mask2former/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1uIWE5KbGFSjrxey2aRd5pWkKNY1_SaNq) | 09.02.2022 |
  • Vladimir Iashin - iashin/SpecVQGAN?style=social)](https://github.com/v-iashin/SpecVQGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.08791), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1711.00937), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2008.00820), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1712.01393), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1512.08512)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/PeihaoChen/regnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/toshas/torch-fidelity), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/descriptinc/melgan-neurips), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/lyra)</li><li>[project](https://iashin.ai/SpecVQGAN)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Foley_(filmmaking)), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Row-_and_column-major_order), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=Bucb3nAa398)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1pxTIMweAKApJZ3ZFqyBee3HtMqFpnwQ0) | 03.02.2022 |
  • Min Jin Chong - pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/replicate/cog)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mchong6/JoJoGAN/blob/master/stylize.ipynb) | 02.02.2022 |
  • Badour AlBahar - lu/)</li> <li>[Jimei Yang](https://github.com/jimeiyang)</li> <li>[Zhixin Shu](https://zhixinshu.github.io/)</li><details><summary>others</summary><li>[Eli Shechtman](https://research.adobe.com/person/eli-shechtman/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/BadourAlBahar/pose-with-style?style=social)](https://github.com/BadourAlBahar/pose-with-style) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.06166)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[project](https://pose-with-style.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/d_ETeAVLilw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tg-bomze/collection-of-notebooks/blob/master/HomeStylist.ipynb) | 19.01.2022 |
  • Zhuang Liu - Yuan Wu](https://chaoyuan.org/)</li> <li>[Christoph Feichtenhofer](https://feichtenhofer.github.io/)</li><details><summary>others</summary><li>[Trevor Darrell](https://people.eecs.berkeley.edu/~trevor/)</li> <li>[Saining Xie](https://www.sainingxie.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167) [![](https://img.shields.io/github/stars/facebookresearch/ConvNeXt?style=social)](https://github.com/facebookresearch/ConvNeXt) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.03545)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/deit), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/unilm/tree/master/beit)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/convnext)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QzCjXqFnWPE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/idiIllIQOfU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QqejV0LNDHA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1CBYTIZ4tBMsVL5cqu9N_-Q3TBprqsfEO) | 19.01.2022 |
  • Felix Petersen - konstanz.de/personen/prof-dr-oliver-deussen/)</li></ul> | [![](https://img.shields.io/github/stars/Felix-Petersen/diffsort?style=social)](https://github.com/Felix-Petersen/diffsort) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.04019), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.09630)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Rl-sFaE1z4M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1q0TZFFYB9FlOJYWKt0_7ZaXQT190anhm) | 17.01.2022 |
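The diffsort entry above covers differentiable sorting networks. As a rough illustration of the general idea (not the paper's exact relaxation), each compare-and-swap in an odd-even sorting network can be softened with a sigmoid so the whole sort becomes differentiable; the `steepness` parameter and the pairing scheme below are illustrative assumptions:

```python
import numpy as np

def soft_cswap(a, b, steepness=10.0):
    """Softly order a pair: blend min/max via a sigmoid on the difference.

    p -> 1 when the pair is already ordered (a <= b), p -> 0 otherwise,
    so (lo, hi) smoothly approaches (min(a, b), max(a, b)).
    """
    p = 1.0 / (1.0 + np.exp(-steepness * (b - a)))
    lo = p * a + (1.0 - p) * b
    hi = p * b + (1.0 - p) * a
    return lo, hi

def soft_sort(x, steepness=10.0):
    """Odd-even transposition network with relaxed compare-and-swap steps."""
    x = np.asarray(x, dtype=float).copy()
    n = len(x)
    for rnd in range(n):
        # Alternate between even-indexed and odd-indexed adjacent pairs.
        for i in range(rnd % 2, n - 1, 2):
            x[i], x[i + 1] = soft_cswap(x[i], x[i + 1], steepness)
    return x
```

With a high steepness the relaxation approaches a hard sort; a low steepness gives smoother gradients at the cost of accuracy.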
  • Patrick Esser - lab.com/people/ommer/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268) [![](https://img.shields.io/github/stars/CompVis/taming-transformers?style=social)](https://github.com/CompVis/taming-transformers) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841)</li><li>[project](https://compvis.github.io/taming-transformers/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/taming-transformers/blob/master/scripts/taming-transformers.ipynb) | 13.01.2022 |
  • Xingchao Liu - Hi8STUrLc2m4DeOviv7NO) | 02.01.2022 |
  • Alex Nichol - QMwAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/openai/glide-text2im?style=social)](https://github.com/openai/glide-text2im) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10741)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ItKi3h7IY2o)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openai/glide-text2im/blob/master/notebooks/inpaint.ipynb) | 22.12.2021 |
  • Keunhong Park - Brualla](https://ricardomartinbrualla.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00581)](https://doi.org/10.1109/ICCV48922.2021.00581) [![](https://img.shields.io/github/stars/google/nerfies?style=social)](https://github.com/google/nerfies) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.12948)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/google-research/tree/master/jaxnerf)</li><li>[project](https://nerfies.github.io/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/photogrammetry/comments/k1i0ct/deformable_neural_radiance_fields_nerfies/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MrKrnHhk8IA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/IDMiMKWucaI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/nerfies/blob/main/notebooks/Nerfies_Capture_Processing.ipynb) | 06.12.2021 |
  • Weihao Yu - sg/poolformer?style=social)](https://github.com/sail-sg/poolformer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.11418)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fvcore), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/apex)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/poolformer)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/sail-sg/poolformer/blob/main/misc/poolformer_demo.ipynb) | 05.12.2021 |
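The poolformer entry above argues that a transformer's token mixer can be as simple as average pooling. A minimal NumPy sketch of such a pooling mixer for tokens on an H×W grid — the edge padding and the subtraction of the input (which the block's residual connection adds back) follow the general recipe, but the details here are illustrative assumptions:

```python
import numpy as np

def pool_mixer(x, k=3):
    """PoolFormer-style token mixer: k x k average pooling minus the input.

    x has shape (H, W, C): tokens on a 2D grid with C channels.
    The identity is subtracted because the surrounding residual
    connection would otherwise add the input twice.
    """
    H, W, C = x.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(x)
    for i in range(H):
        for j in range(W):
            out[i, j] = xp[i:i + k, j:j + k].mean(axis=(0, 1))
    return out - x
```

On a constant input the pooled mean equals the input, so the mixer contributes nothing — only local variation between tokens gets mixed.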
  • Yuval Alaluf - badge.php?doi=10.1109/CVPR52688.2022.01796)](https://doi.org/10.1109/CVPR52688.2022.01796) [![](https://img.shields.io/github/stars/yuval-alaluf/hyperstyle?style=social)](https://github.com/yuval-alaluf/hyperstyle) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.15666), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.03189), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09036), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.07727)</li><li>[data](https://ai.stanford.edu/~jkrause/cars/car_dataset.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/clovaai/stargan-v2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TreB1eN/InsightFace_Pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/HuangYG123/CurricularFace), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lessw2020/Ranger-Deep-Learning-Optimizer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/dvschultz/stylegan2-ada-pytorch)</li><li>[project](https://yuval-alaluf.github.io/hyperstyle/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_sbXmLY2jMw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yuval-alaluf/hyperstyle/blob/master/notebooks/inference_playground.ipynb) | 03.12.2021 |
  • Omer Tov - alaluf.github.io/)</li> <li>[Yotam Nitzan](https://yotamnitzan.github.io/)</li> <li>[Or Patashnik](https://orpatashnik.github.io/)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450626.3459838)](https://doi.org/10.1145/3450626.3459838) [![](https://img.shields.io/github/stars/omertov/encoder4editing?style=social)](https://github.com/omertov/encoder4editing) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.02766)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/eladrich/pixel2style2pixel)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/omertov/encoder4editing/blob/master/notebooks/inference_playground.ipynb) | 02.12.2021 |
  • Wonjong Jang - us/research/people/xtong/)</li> <li>[Seungyong Lee](https://scholar.google.com/citations?user=yGPH-nAAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/wonjongg/StyleCariGAN?style=social)](https://github.com/wonjongg/StyleCariGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.04331)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[project](https://wonjongg.github.io/StyleCariGAN/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=kpHbGOlI-BU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1HDRQGm7pvC9mAb6Lktoft_SmY9sCq_Qg) | 30.11.2021 |
  • Tobias Sunderdiek - badge.php?doi=10.1109/CVPR.2018.00986)](https://doi.org/10.1109/CVPR.2018.00986) <ul><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/alamson/safebooru)</li><li>[project](https://tobiassunderdiek.github.io/cartoon-gan/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/TobiasSunderdiek/cartoon-gan/blob/master/CartoonGAN.ipynb) | 24.11.2021 |
  • Xuanhong Chen - badge.php?doi=10.1145/3394171.3413630)](https://doi.org/10.1145/3394171.3413630) [![](https://img.shields.io/github/stars/neuralchen/SimSwap?style=social)](https://github.com/neuralchen/SimSwap) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.06340)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepinsight/insightface)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/neuralchen/SimSwap/blob/master/SimSwap%20colab.ipynb) | 24.11.2021 |
  • Shanchuan Lin - mGCEYEzM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/10z-pNKRnVNsp0Lq9tH1J_XPZ7CBC_uHm) | 24.11.2021 |
  • Shanchuan Lin - saleemi)</li> <li>[Soumyadip Sengupta](https://github.com/senguptaumd)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/WACV51458.2022.00319)](https://doi.org/10.1109/WACV51458.2022.00319) [![](https://img.shields.io/github/stars/PeterL1n/RobustVideoMatting?style=social)](https://github.com/PeterL1n/RobustVideoMatting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.11515)</li><li>[project](https://peterl1n.github.io/RobustVideoMatting)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/pdbpmg/r_robust_highresolution_video_matting_with/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Jvzltozpbpk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ay-mGCEYEzM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VL-0K6HjhvQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Jhuf6M_VrBI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_oN9yyRi3HY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/10z-pNKRnVNsp0Lq9tH1J_XPZ7CBC_uHm) | 24.11.2021 |
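The matting entries above predict a per-pixel alpha matte and foreground; downstream, frames are combined with a new background by standard alpha compositing. A minimal sketch (array shapes and names are illustrative, not the repository's API):

```python
import numpy as np

def composite(fgr, alpha, bgr):
    """Alpha-composite: out = alpha * foreground + (1 - alpha) * background.

    fgr and bgr are (H, W, 3) float arrays; alpha is (H, W) or (H, W, 1).
    """
    if alpha.ndim == fgr.ndim - 1:
        alpha = alpha[..., None]  # broadcast the matte over color channels
    return alpha * fgr + (1.0 - alpha) * bgr
```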
  • Xin Chen - badge.php?doi=10.1007/978-981-15-5577-0_18)](https://doi.org/10.1007/978-981-15-5577-0_18) [![](https://img.shields.io/github/stars/bryandlee/animegan2-pytorch?style=social)](https://github.com/bryandlee/animegan2-pytorch) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/TachibanaYoshino/AnimeGANv2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TachibanaYoshino/AnimeGAN)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/AnimeGANv2)</li><li>[project](https://tachibanayoshino.github.io/AnimeGANv2/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bryandlee/animegan2-pytorch/blob/master/colab_demo.ipynb) | 17.11.2021 |
  • Min Jin Chong - Ying Lee](http://hsinyinglee.com/)</li> <li>[David Forsyth](http://luthuli.cs.uiuc.edu/~daf/)</li></ul> | [![](https://img.shields.io/github/stars/mchong6/SOAT?style=social)](https://github.com/mchong6/SOAT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.01619)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/justinpinkney/toonify), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/SOAT)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mchong6/SOAT/blob/master/infinity.ipynb) | 13.11.2021 |
  • Chrisantha Fernando - Baptiste Alayrac](https://www.jbalayrac.com/)</li> <li>[Piotr Mirowski](https://piotrmirowski.com/)</li><details><summary>others</summary><li>[Dylan Banarse](https://www.2ne1.com/)</li> <li>[Simon Osindero](https://scholar.google.com/citations?user=Jq8ZS5kAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/arnheim?style=social)](https://github.com/deepmind/arnheim) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.00162), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.14843), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.07729), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1606.02580), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1609.09106)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/dall-e)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Compositional_pattern-producing_network)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=U7guaMdeF4g), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=zh0goLbS-l0), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=SYJGNt7yu6M), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=MxkYKa0x5AU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/arnheim/blob/master/arnheim_2.ipynb) | 11.11.2021 |
  • Mikael Christensen - badge.php?doi=10.1109/CVPR42600.2020.00813)](https://doi.org/10.1109/CVPR42600.2020.00813) [![](https://img.shields.io/github/stars/NVlabs/stylegan2?style=social)](https://github.com/NVlabs/stylegan2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.04958)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/c-NJtV9Jvp0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1ShgW6wohEFQtqs_znMna3dzrcVoABKIH) | 05.11.2021 |
  • Yifu Zhang - fmh.github.io/)</li><details><summary>others</summary><li>[Ping Luo](http://luoping.me/)</li> <li>[Xinggang Wang](https://xinggangw.info/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1) [![](https://img.shields.io/github/stars/ifzhang/ByteTrack?style=social)](https://github.com/ifzhang/ByteTrack) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.06864)</li><li>[data](https://motchallenge.net/), [data](https://www.crowdhuman.org/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Megvii-BaseDetection/YOLOX), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ifzhang/FairMOT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PeizeSun/TransTrack), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/samylee/Towards-Realtime-MOT-Cpp)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/multi-object-tracking)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1bDilg4cmXFa8HCKHbsZ_p16p0vrhLyu0) | 30.10.2021 |
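ByteTrack's core idea (BYTE) is to associate existing tracks with high-confidence detections first, then give the remaining tracks a second chance against low-confidence detections. A simplified greedy-IoU sketch of that two-stage association — the real tracker uses Kalman-filter prediction and Hungarian matching, and the thresholds here are illustrative assumptions:

```python
import numpy as np

def iou(a, b):
    """IoU of two boxes in (x1, y1, x2, y2) format."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def byte_associate(tracks, dets, scores, high=0.6, iou_thr=0.3):
    """Two-stage greedy association; returns {track index: detection index}."""
    high_ids = [i for i, s in enumerate(scores) if s >= high]
    low_ids = [i for i, s in enumerate(scores) if s < high]
    matches, free_tracks = {}, list(range(len(tracks)))
    for det_pool in (high_ids, low_ids):   # stage 1: high, stage 2: low
        pool = list(det_pool)
        for t in list(free_tracks):
            if not pool:
                break
            best = max(pool, key=lambda d: iou(tracks[t], dets[d]))
            if iou(tracks[t], dets[best]) >= iou_thr:
                matches[t] = best
                pool.remove(best)
                free_tracks.remove(t)
    return matches
```

The second stage is what recovers occluded objects whose detection score dropped below the usual keep threshold.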
  • Max Woolf - 2?style=social)](https://github.com/openai/gpt-2) <ul><li>[blog post](https://minimaxir.com/2019/09/howto-gpt2/), [blog post](https://openai.com/research/better-language-models)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/minimaxir/gpt-2-simple)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/aqlzde/r_openai_better_language_models_and_their/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1VLG8e7YSEwypxU-noRNhsv5dW4NfTGce) | 18.10.2021 |
  • Asher Trockman - cifar10), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/codex/an-overview-on-convmixer-patches-are-all-you-need-8502a8d87011)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Gl0s0GDqN3c?t=990)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/locuslab/convmixer/blob/main/pytorch-image-models/notebooks/EffResNetComparison.ipynb) | 05.10.2021 |
  • Arantxa Casanova - careil-901804155)</li> <li>[Jakob Verbeek](http://thoth.inrialpes.fr/~verbeek/)</li> <li>[Michał Drożdżal](https://scholar.google.com/citations?user=XK_ktwQAAAAJ)</li> <li>[Adriana Romero-Soriano](https://sites.google.com/site/adriromsor)</li></ul> | [![](https://img.shields.io/github/stars/facebookresearch/ic_gan?style=social)](https://github.com/facebookresearch/ic_gan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.05070)</li><li>[blog post](https://ai.facebook.com/blog/instance-conditioned-gans/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/faiss), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ajbrock/BigGAN-PyTorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/bioinf-jku/TTUR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mit-han-lab/data-efficient-gans)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2021/hash/e7ac288b0f2d41445904d071ba37aaff-Abstract.html)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/ic_gan/blob/master/inference/icgan_colab.ipynb) | 01.10.2021 |
  • Suman Ravuri - willson-6a1b422)</li> <li>[Dmitry Kangin](https://scholar.google.com/citations?user=vv-leaMAAAAJ)</li><details><summary>others</summary><li>[Rémi Lam](https://scholar.google.com/citations?user=Sm7xCbEAAAAJ)</li> <li>[Piotr Mirowski](https://piotrmirowski.com/)</li> <li>[Maria Athanassiadou](https://scholar.google.com/citations?user=VtkgHP0AAAAJ)</li> <li>[Sheleem Kashem](https://www.linkedin.com/in/sheleemkashem/)</li> <li>[Rachel Prudden](https://computerscience.exeter.ac.uk/staff/rep218)</li> <li>[Amol Mandhane](https://github.com/amol-mandhane)</li> <li>[Aidan Clark](https://scholar.google.com/citations?user=_19DrfIAAAAJ)</li> <li>[Andrew Brock](https://github.com/ajbrock)</li> <li>[Karen Simonyan](https://scholar.google.com/citations?user=L7lMQkQAAAAJ)</li> <li>[Raia Hadsell](https://github.com/raiah)</li> <li>[Niall Robinson](https://github.com/niallrobinson)</li> <li>[Ellen Clancy](https://www.linkedin.com/in/ellen-clancy-815967124)</li> <li>[Shakir Mohamed](https://www.shakirm.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03854-z)](https://doi.org/10.1038/s41586-021-03854-z) [![](https://img.shields.io/github/stars/deepmind/deepmind-research?style=social)](https://github.com/deepmind/deepmind-research/tree/master/nowcasting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.00954)</li><li>[blog post](https://deepmind.com/blog/article/nowcasting)</li><li>[local kernel](https://research.google.com/colaboratory/local-runtimes.html)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/hub)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/deepmind-research/blob/master/nowcasting/Open_sourced_dataset_and_model_snapshot_for_precipitation_nowcasting.ipynb) | 29.09.2021 |
  • Yuanxun Lu - head-Generation-with-Rhythmic-Head-Motion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DinoMan/speech-driven-animation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix)</li><li>[project](https://yuanxunlu.github.io/projects/LiveSpeechPortraits/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1tKvi-9kY3GkEK8lgtfTSM70rMFo_TY50) | 26.09.2021 |
  • Oran Lang - badge.php?doi=10.1109/ICCV48922.2021.00073)](https://doi.org/10.1109/ICCV48922.2021.00073) [![](https://img.shields.io/github/stars/google/explaining-in-style?style=social)](https://github.com/google/explaining-in-style) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.13369), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1906.10112), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.12799), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.04958), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.01711)</li><li>[blog post](https://ai.googleblog.com/2022/01/introducing-stylex-new-approach-for.html)</li><li>[project](https://explaining-in-style.github.io/)</li><li>[supplementary](https://explaining-in-style.github.io/supmat.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/wLk2eBdXH4M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/explaining-in-style/blob/main/Explaining_in_Style_AttFind.ipynb) | 25.08.2021 |
  • Jaehyeon Kim - demo/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1CO61pZizDj7en71NQG_aqqKdGaA_SaBf) | 23.08.2021 |
  • Ziyu Wan - zhang.me/)</li> <li>[Dongdong Chen](http://www.dongdongchen.bid/)</li> <li>[Pan Zhang](https://panzhang0212.github.io/)</li><details><summary>others</summary><li>[Dong Chen](http://www.dongchen.pro/)</li> <li>[Jing Liao](https://liaojing.github.io/html/)</li> <li>[Fang Wen](https://www.microsoft.com/en-us/research/people/fangwen/)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/Bringing-Old-Photos-Back-to-Life?style=social)](https://github.com/microsoft/Bringing-Old-Photos-Back-to-Life) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.09484)</li><li>[demo](https://replicate.com/microsoft/bringing-old-photos-back-to-life)</li><li>[project](http://raywzy.com/Old_Photo/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Q5bhszQq9eA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1NEm6AsybIiC5TwTU_4DqDkQO0nFRB-uA) | 13.07.2021 |
  • Daniel Roich - Or](https://danielcohenor.com/)</li></ul> | [![](https://img.shields.io/github/stars/danielroich/PTI?style=social)](https://github.com/danielroich/PTI) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.05744)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/richzhang/PerceptualSimilarity)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/danielroich/PTI/blob/main/notebooks/inference_playground.ipynb) | 01.07.2021 |
  • Weihao Xia - Hao Xue](http://www.homepages.ucl.ac.uk/~ucakjxu/)</li> <li>[Baoyuan Wu](https://sites.google.com/site/baoyuanwu2015/home)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00229)](https://doi.org/10.1109/CVPR46437.2021.00229) [![](https://img.shields.io/github/stars/IIGROUP/TediGAN?style=social)](https://github.com/IIGROUP/TediGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.03308), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.08910)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/weihaox/Multi-Modal-CelebA-HQ), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/fyu/lsun)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L8Na2f5viAM)</li></ul> | [![Open In Colab](images/colab.svg)](http://colab.research.google.com/github/weihaox/TediGAN/blob/master/playground.ipynb) | 30.06.2021 |
  • Ming Ding - gEAAAAJ)</li> <li>[Wenyi Hong](https://github.com/wenyihong)</li> <li>[Wendi Zheng](https://github.com/minkowski0125)</li><details><summary>others</summary><li>[Chang Zhou](https://scholar.google.com/citations?user=QeSoG3sAAAAJ)</li> <li>[Junyang Lin](https://justinlin610.github.io/)</li> <li>[Xu Zou](http://xuzou.cn/)</li> <li>[Zhou Shao](https://www.researchgate.net/profile/Shao_Zhou4)</li> <li>[Hongxia Yang](https://sites.google.com/site/hystatistics/home)</li> <li>[Jie Tang](https://keg.cs.tsinghua.edu.cn/jietang/)</li></ul></details> | [![](https://img.shields.io/github/stars/THUDM/CogView?style=social)](https://github.com/THUDM/CogView) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.13290)</li><li>[demo](https://thudm.github.io/CogView/index.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/apex), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Sleepychord/cogdata)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/cogview-image-generation-and-language-modelling-at-scale-8d358a0686d2)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2021/hash/a4d92e2cd541fca87e4620aba658316d-Abstract.html)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/nmxsd8/r_cogview_mastering_texttoimage_generation_via/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Cw1r8ACIj8U)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Bi2TnSUp2vNiSUhamsNuC4HqkZ2J4WwZ) | 21.06.2021 |
  • Min Jin Chong - pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/znxlwm/UGATIT-pytorch)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VNg0NyCGl_4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mchong6/GANsNRoses/blob/master/inference_colab.ipynb) | 19.06.2021 |
  • Dmytro Kotovenko - wright.github.io/)</li> <li>[Arthur Heimbrecht](https://github.com/arwehei)</li> <li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01202)](https://doi.org/10.1109/CVPR46437.2021.01202) [![](https://img.shields.io/github/stars/CompVis/brushstroke-parameterized-style-transfer?style=social)](https://github.com/CompVis/brushstroke-parameterized-style-transfer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.17185)</li><li>[project](https://compvis.github.io/brushstroke-parameterized-style-transfer/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/brushstroke-parameterized-style-transfer/blob/tensorflow_v2/notebooks/BrushstrokeStyleTransfer_TF2.ipynb) | 02.06.2021 |
  • Elad Richardson - alaluf.github.io/)</li> <li>[Yotam Nitzan](https://yotamnitzan.github.io/)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00232)](https://doi.org/10.1109/CVPR46437.2021.00232) [![](https://img.shields.io/github/stars/eladrich/pixel2style2pixel?style=social)](https://github.com/eladrich/pixel2style2pixel) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2008.00951)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/HuangYG123/CurricularFace)</li><li>[project](https://eladrich.github.io/pixel2style2pixel/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bfvSwhqsTgM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/eladrich/pixel2style2pixel/blob/master/notebooks/inference_playground.ipynb) | 01.06.2021 |
  • Chen Chen - fen)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/N19-1423)](https://doi.org/10.18653/v1/N19-1423) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1810.04805)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://tensorflow.org/hub)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/models/blob/master/official/colab/fine_tuning_bert.ipynb) | 24.05.2021 |
  • Yuval Alaluf - Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00664)](https://doi.org/10.1109/ICCV48922.2021.00664) [![](https://img.shields.io/github/stars/yuval-alaluf/restyle-encoder?style=social)](https://github.com/yuval-alaluf/restyle-encoder) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.02699), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2008.00951), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.02766)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TreB1eN/InsightFace_Pytorch)</li><li>[project](https://yuval-alaluf.github.io/restyle-encoder/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yuval-alaluf/restyle-encoder/blob/master/notebooks/inference_playground.ipynb) | 21.05.2021 |
  • Aliaksandr Siarohin - badge.php?doi=10.1109/ICPR48806.2021.9412520)](https://doi.org/10.1109/ICPR48806.2021.9412520) [![](https://img.shields.io/github/stars/AliaksandrSiarohin/motion-cosegmentation?style=social)](https://github.com/AliaksandrSiarohin/motion-cosegmentation) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.03234)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/video-preprocessing)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=RJ4Nj1wV5iA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/AliaksandrSiarohin/motion-cosegmentation/blob/master/part_swap.ipynb) | 07.04.2020 |
  • Yuval Alaluf - Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450626.3459805)](https://doi.org/10.1145/3450626.3459805) [![](https://img.shields.io/github/stars/yuval-alaluf/SAM?style=social)](https://github.com/yuval-alaluf/SAM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.02754)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/eladrich/pixel2style2pixel), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[project](https://yuval-alaluf.github.io/SAM/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/X_pYC_LtBFw)</li></ul> | [![Open In Colab](images/colab.svg)](http://colab.research.google.com/github/yuval-alaluf/SAM/blob/master/notebooks/animation_inference_playground.ipynb) | 26.04.2021 |
  • Robin Rombach - lab.com/people/ommer/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.01409)](https://doi.org/10.1109/ICCV48922.2021.01409) [![](https://img.shields.io/github/stars/CompVis/geometry-free-view-synthesis?style=social)](https://github.com/CompVis/geometry-free-view-synthesis) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.07652)</li><li>[data](https://google.github.io/realestate10k/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/colmap/colmap)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/geometry-free-view-synthesis/blob/master/scripts/braindance.ipynb) | 22.04.2021 |
  • Yu-Lun Liu - Sheng Lai](https://www.wslai.net/)</li> <li>[Ming-Hsuan Yang](https://faculty.ucmerced.edu/mhyang/)</li> <li>[Yung-Yu Chuang](https://www.csie.ntu.edu.tw/~cyy/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00230)](https://doi.org/10.1109/ICCV48922.2021.00230) [![](https://img.shields.io/github/stars/alex04072000/NeRViS?style=social)](https://github.com/alex04072000/NeRViS) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.06205)</li><li>[data](http://liushuaicheng.org/SIGGRAPH2013/database.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cxjyxxme/deep-online-video-stabilization), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/jinsc37/DIFRINT)</li><li>[project](https://alex04072000.github.io/NeRViS/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/KO3sULs4hso)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1l-fUzyM38KJMZyKMBWw_vu7ZUyDwgdYH) | 11.04.2021 |
  • Suttisak Wizadwongsa - yenphraphai-990ba6175/)</li> <li>[Supasorn Suwajanakorn](https://www.supasorn.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00843)](https://doi.org/10.1109/CVPR46437.2021.00843) [![](https://img.shields.io/github/stars/nex-mpi/nex-code?style=social)](https://github.com/nex-mpi/nex-code) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.05606)</li><li>[data](https://vistec-my.sharepoint.com/personal/pakkapon_p_s19_vistec_ac_th/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fpakkapon%5Fp%5Fs19%5Fvistec%5Fac%5Fth%2FDocuments%2Fpublic%2FVLL%2FNeX%2Fshiny%5Fdatasets&originalPath=aHR0cHM6Ly92aXN0ZWMtbXkuc2hhcmVwb2ludC5jb20vOmY6L2cvcGVyc29uYWwvcGFra2Fwb25fcF9zMTlfdmlzdGVjX2FjX3RoL0VuSVVoc1JWSk9kTnNaXzRzbWRoeWUwQjh6MFZseHFPUjM1SVIzYnAwdUd1cFE%5FcnRpbWU9WXRVQTQtQTcyVWc), [data](https://vistec-my.sharepoint.com/personal/pakkapon_p_s19_vistec_ac_th/_layouts/15/onedrive.aspx?originalPath=aHR0cHM6Ly92aXN0ZWMtbXkuc2hhcmVwb2ludC5jb20vOmY6L2cvcGVyc29uYWwvcGFra2Fwb25fcF9zMTlfdmlzdGVjX2FjX3RoL0VyalBSUkw5Sm5GSXA4TU42ZDFqRXVvQjNYVm94SmtmZlBqZm9QeWhIa2owZGc%5FcnRpbWU9bC0yYWctRTcyVWc&id=%2Fpersonal%2Fpakkapon%5Fp%5Fs19%5Fvistec%5Fac%5Fth%2FDocuments%2Fpublic%2FVLL%2FNeX%2Fmodified%5Fdataset)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Fyusion/LLFF)</li><li>[project](https://nex-mpi.github.io/)</li><li>[vistec](https://vistec.ist/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=HyfkF7Z-ddA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1hXVvYdAwLA0EFg2zrafJUE0bFgB_F7PU) | 25.03.2021 |
  • Yang Song - [Jascha Sohl-Dickstein](http://www.sohldickstein.com/)</li> <li>[Diederik Kingma](http://dpkingma.com/)</li> <li>[Abhishek Kumar](https://abhishek.umiacs.io/)</li><details><summary>others</summary><li>[Stefano Ermon](https://cs.stanford.edu/~ermon/)</li> <li>[Ben Poole](https://cs.stanford.edu/~poole/)</li></ul></details> | [![](https://img.shields.io/github/stars/yang-song/score_sde?style=social)](https://github.com/yang-song/score_sde) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.13456), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.05600), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.09011), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.11239)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/yang-song/score_sde_pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/ml_collections)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L9ZegT87QK8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yang-song/score_sde/blob/main/Score_SDE_demo.ipynb) | 18.03.2021 |
  • Pramook Khungurn - head-anime-demo?style=social)](https://github.com/pkhungurn/talking-head-anime-demo) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lincolnhard/head-pose-estimation)</li><li>[project](https://pkhungurn.github.io/talking-head-anime/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Virtual_YouTuber), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/MikuMikuDance)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kMQCERkTdO0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/T1Gp-RxFZwU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FioRJ6x_RbI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/pkhungurn/talking-head-anime-demo/blob/master/tha_colab.ipynb) | 23.02.2021 |
  • Andrew Brock - research?style=social)](https://github.com/deepmind/deepmind-research/tree/master/nfnets) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.06171), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2101.08692)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/jaxline)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rNkHjZtH0RQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/qyy2WhRRSI4?feature=share)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/deepmind-research/blob/master/nfnets/nfnet_demo_colab.ipynb) | 17.02.2021 |
  • Konstantin Sofiiuk - inf.mpg.de/people/Petrov.html)</li> <li>[Anton Konushin](https://scholar.google.com/citations?user=ZT_k-wMAAAAJ)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICIP46576.2022.9897365)](https://doi.org/10.1109/ICIP46576.2022.9897365) [![](https://img.shields.io/github/stars/SamsungLabs/ritm_interactive_segmentation?style=social)](https://github.com/SamsungLabs/ritm_interactive_segmentation) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.06583)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/HRNet/HRNet-Image-Classification)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/interactive-segmentation-on-grabcut?p=reviving-iterative-training-with-mask), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/interactive-segmentation-on-berkeley?p=reviving-iterative-training-with-mask)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/SamsungLabs/ritm_interactive_segmentation/blob/master/notebooks/colab_test_any_model.ipynb) | 13.02.2021 |
  • Jong Wook Kim - 2021/Slides/9193.pdf)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openai/clip/blob/master/Interacting_with_CLIP.ipynb) | 29.01.2021 |
  • Tom Brown - lab/cleverhans/blob/master/examples/adversarial_patch/AdversarialPatch.ipynb) | 27.01.2021 |
  • Hang Zhang - badge.php?doi=10.1007/978-3-030-11018-5_32)](https://doi.org/10.1007/978-3-030-11018-5_32) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.06953)</li><li>[project](http://computervisionrutgers.github.io/MSG-Net/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=oy6pWNWBt4Y)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/zhanghang1989/PyTorch-Multi-Style-Transfer/blob/master/msgnet.ipynb) | 25.01.2021 |
  • Konstantin Sofiiuk - inf.mpg.de/people/Petrov.html)</li> <li>[Olga Barinova](https://github.com/OlgaBarinova)</li> <li>[Anton Konushin](https://scholar.google.com/citations?user=ZT_k-wMAAAAJ)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00865)](https://doi.org/10.1109/CVPR42600.2020.00865) [![](https://img.shields.io/github/stars/SamsungLabs/fbrs_interactive_segmentation?style=social)](https://github.com/SamsungLabs/fbrs_interactive_segmentation) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2001.10331)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/HRNet/HRNet-Image-Classification)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ArcZ5xtyMCk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xg-5J9gLuXA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/SamsungLabs/fbrs_interactive_segmentation/blob/master/notebooks/colab_test_any_model.ipynb) | 25.01.2021 |
  • Somshubra Majumdar - badge.php?doi=10.1167/16.12.326)](https://doi.org/10.1167/16.12.326) [![](https://img.shields.io/github/stars/titu1994/Neural-Style-Transfer?style=social)](https://github.com/titu1994/Neural-Style-Transfer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](http://arxiv.org/abs/1508.06576), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](http://arxiv.org/abs/1605.04603), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1606.05897)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/titu1994/Neural-Style-Transfer/blob/master/NeuralStyleTransfer.ipynb) | 22.01.2021 |
  • Zhengxia Zou - badge.php?doi=10.1109/TIP.2022.3192717)](https://doi.org/10.1109/TIP.2022.3192717) [![](https://img.shields.io/github/stars/jiupinjia/SkyAR?style=social)](https://github.com/jiupinjia/SkyAR) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2010.11800)</li><li>[project](https://jiupinjia.github.io/skyar/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=zal9Ues0aOQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jiupinjia/SkyAR/blob/master/colab_demo.ipynb) | 18.01.2021 |
  • Prakruti Joshi - music-theory.html)</li><li>[musicXML](https://www.musicxml.com/for-developers/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/magenta/magenta-demos/blob/master/colab-notebooks/MusicXML_Document_Structure_Documentation.ipynb) | 08.01.2021 |
  • Raphael Gontijo Lopes - badge.php?doi=10.1109/ICCV.2019.00802)](https://doi.org/10.1109/ICCV.2019.00802) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.02632)</li><li>[blog post](https://magenta.tensorflow.org/svg-vae)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/magenta/magenta-demos/blob/master/colab-notebooks/vae_svg_decoding.ipynb) | 08.01.2021 |
  • Zhengxia Zou - magic-eye?style=social)](https://github.com/jiupinjia/neural-magic-eye) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.15692)</li><li>[project](https://jiupinjia.github.io/neuralmagiceye/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=Fkh7DEblqJ8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1f59dFLJ748i2TleE54RkbUZSMo9Hyx7l) | 01.01.2021 |
  • Chen Gao - [Jia-Bin Huang](https://jbhuang0604.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58610-2_42)](https://doi.org/10.1007/978-3-030-58610-2_42) [![](https://img.shields.io/github/stars/vt-vl-lab/FGVC?style=social)](https://github.com/vt-vl-lab/FGVC) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2009.01835)</li><li>[project](http://chengao.vision/FGVC/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=CHHVPxHT7rc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1pb6FjWdwq_q445rG2NP0dubw7LKNUkqc) | 30.12.2020 |
  • Muhammed Kocabas - nik)</li> <li>[Michael Black](https://ps.is.mpg.de/person/black)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00530)](https://doi.org/10.1109/CVPR42600.2020.00530) [![](https://img.shields.io/github/stars/mkocabas/VIBE?style=social)](https://github.com/mkocabas/VIBE) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.05656)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/carlosedubarreto/vibe_win_install), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vchoutas/smplx), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/akanazawa/human_dynamics), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/MandyMo/pytorch_HMR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/soulslicer/STAF/tree/staf)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/3d-human-pose-estimation-on-3dpw?p=vibe-video-inference-for-human-body-pose-and)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3qhs5IRJ1LI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w1biKeiQThY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rIr-nX63dUA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fW0sIZfQcIs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8Qt0wA16kTo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xyo5gl5GLEI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/XNzgUhxKC38), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hErK0MamTY4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Gfmm8uMfMq0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dFfwxZ52MN86FA6uFNypMEdFShd2euQA) | 23.12.2020 |
  • Yujun Shen - badge.php?doi=10.1109/CVPR46437.2021.00158)](https://doi.org/10.1109/CVPR46437.2021.00158) [![](https://img.shields.io/github/stars/genforce/sefa?style=social)](https://github.com/genforce/sefa) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2007.06600)</li><li>[project](https://genforce.github.io/sefa/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=OFHW2WbXXIQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/genforce/sefa/blob/master/docs/SeFa.ipynb) | 06.12.2020 |
  • Zhengxia Zou - badge.php?doi=10.1109/CVPR46437.2021.01543)](https://doi.org/10.1109/CVPR46437.2021.01543) [![](https://img.shields.io/github/stars/jiupinjia/stylized-neural-painting?style=social)](https://github.com/jiupinjia/stylized-neural-painting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.08114)</li><li>[project](https://jiupinjia.github.io/neuralpainter/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=oerb-nwrXhk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1ch_41GtcQNQT1NLOA21vQJ_rQOjjv9D8) | 01.12.2020 |
  • Alexander Kolesnikov - badge.php?doi=10.1007/978-3-030-58558-7_29)](https://doi.org/10.1007/978-3-030-58558-7_29) [![](https://img.shields.io/github/stars/google-research/big_transfer?style=social)](https://github.com/google-research/big_transfer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.11370), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.05237)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/google/bit-50)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://sh-tsang.medium.com/review-big-transfer-bit-general-visual-representation-learning-cb4bf8ed9732)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/k1GOF2jmX7c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0iTgt5-SOsU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/X5Rhm__OxvA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/big_transfer/blob/master/colabs/big_transfer_tf2.ipynb) | 12.11.2020 |
  • Woosung Choi - badge.php?doi=10.1109/ICASSP39728.2021.9413896)](https://doi.org/10.1109/ICASSP39728.2021.9413896) [![](https://img.shields.io/github/stars/ws-choi/Conditioned-Source-Separation-LaSAFT?style=social)](https://github.com/ws-choi/Conditioned-Source-Separation-LaSAFT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2010.11631)</li><li>[data](https://sigsep.github.io/datasets/musdb.html)</li><li>[project](https://lasaft.github.io/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ws-choi/Conditioned-Source-Separation-LaSAFT/blob/master/colab_demo/LaSAFT_with_GPoCM_Stella_Jang_Example.ipynb) | 01.11.2020 |
  • Roy Or-El - shechtman/)</li> <li>[Ira Kemelmacher-Shlizerman](https://www.irakemelmacher.com/)</li></ul> | [![](https://img.shields.io/github/stars/royorel/Lifespan_Age_Transformation_Synthesis?style=social)](https://github.com/royorel/Lifespan_Age_Transformation_Synthesis) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.09764)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/royorel/FFHQ-Aging-Dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/pix2pixHD), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/style-based-gan-pytorch)</li><li>[project](https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_jTFcjN2hBk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9fulnt2_q_Y)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/royorel/Lifespan_Age_Transformation_Synthesis/blob/master/LATS_demo.ipynb) | 31.10.2020 |
  • Ceyuan Yang - badge.php?doi=10.1007/s11263-020-01429-5)](https://doi.org/10.1007/s11263-020-01429-5) [![](https://img.shields.io/github/stars/genforce/higan?style=social)](https://github.com/genforce/higan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.09267), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1412.6856), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1906.10112)</li><li>[project](https://genforce.github.io/higan/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=X5yWu2Jwjpg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/genforce/higan/blob/master/docs/HiGAN_Bedroom.ipynb) | 14.10.2020 |
  • Yujun Shen - badge.php?doi=10.1109/CVPR42600.2020.00926)](https://doi.org/10.1109/CVPR42600.2020.00926) [![](https://img.shields.io/github/stars/genforce/interfacegan?style=social)](https://github.com/genforce/interfacegan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.10786), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.09635), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.10196)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tkarras/progressive_growing_of_gans), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan)</li><li>[project](https://genforce.github.io/interfacegan/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=uoftpl3Bj6w)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/genforce/interfacegan/blob/master/docs/InterFaceGAN.ipynb) | 13.10.2020 |
  • Jheng-Wei Su - badge.php?doi=10.1109/CVPR42600.2020.00799)](https://doi.org/10.1109/CVPR42600.2020.00799) [![](https://img.shields.io/github/stars/ericsujw/InstColorization?style=social)](https://github.com/ericsujw/InstColorization) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.10825)</li><li>[project](https://ericsujw.github.io/InstColorization/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=Zj1N4uE1ehk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ericsujw/InstColorization/blob/master/InstColorization.ipynb) | 30.08.2020 |
  • Kaiming He - badge.php?doi=10.1109/CVPR42600.2020.00975)](https://doi.org/10.1109/CVPR42600.2020.00975) [![](https://img.shields.io/github/stars/facebookresearch/moco?style=social)](https://github.com/facebookresearch/moco) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.05722), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.04297), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.02677)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ppwwyyxx/moco.tensorflow)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LvHwBQF14zs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4VVGtYPM8JE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/o5Qh61dLDf0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/moco/blob/colab-notebook/colab/moco_cifar10_demo.ipynb) | 20.08.2020 |
  • David Bau - [Jun-Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li> <li>[Antonio Torralba](https://groups.csail.mit.edu/vision/torralbalab/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58452-8_21)](https://doi.org/10.1007/978-3-030-58452-8_21) [![](https://img.shields.io/github/stars/davidbau/rewriting?style=social)](https://github.com/davidbau/rewriting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2007.15646), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.04958)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[project](https://rewriting.csail.mit.edu/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=i2_-zNqtEPk), [<img src="images/yt.svg" alt="yt" height=20/>](https://rewriting.csail.mit.edu/video/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/davidbau/rewriting/blob/master/notebooks/rewriting-interface.ipynb) | 31.07.2020 |
  • Vincent Sitzmann - hw7FJOEUK1tX7mdp8SKB368K)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2020/hash/53c04118df112c13a8c34b38343b9c10-Abstract.html)</li><li>[project](https://vsitzmann.github.io/siren/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=Q2fLWGBeaiI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/vsitzmann/siren/blob/master/explore_siren.ipynb) | 24.06.2020 |
  • Ryota Natsume - badge.php?doi=10.1109/ICCV.2019.00239)](https://doi.org/10.1109/ICCV.2019.00239) [![](https://img.shields.io/github/stars/shunsukesaito/PIFu?style=social)](https://github.com/shunsukesaito/PIFu) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1905.05172)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=S1FpjwKqtPs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1GFSsqP2BWz4gtq0e-nki00ZHSirXwFyY) | 17.06.2020 |
  • Meng-Li Shih - [Shih-Yang Su](https://lemonatsu.github.io/)</li> <li>[Johannes Kopf](https://johanneskopf.de/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00805)](https://doi.org/10.1109/CVPR42600.2020.00805) [![](https://img.shields.io/github/stars/vt-vl-lab/3d-photo-inpainting?style=social)](https://github.com/vt-vl-lab/3d-photo-inpainting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.04727)</li><li>[project](https://shihmengli.github.io/3D-Photo-Inpainting/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1706ToQrkIZshRSJSHvZ1RuCiM__YX3Bz) | 04.05.2020 |
  • Aliaksandr Siarohin - badge.php?doi=10.1109/ICPR48806.2021.9412520)](https://doi.org/10.1109/ICPR48806.2021.9412520) [![](https://img.shields.io/github/stars/AliaksandrSiarohin/motion-cosegmentation?style=social)](https://github.com/AliaksandrSiarohin/motion-cosegmentation) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](http://arxiv.org/abs/2004.03234)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/video-preprocessing)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=RJ4Nj1wV5iA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/AliaksandrSiarohin/motion-cosegmentation/blob/master/part_swap.ipynb) | 07.04.2020 |
  • Curtis Hawthorne - frames)</li><li>[data](https://g.co/magenta/maestro-wave2midi2wave), [data](https://magenta.tensorflow.org/datasets/e-gmd)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/notebooks/magenta/onsets_frames_transcription/onsets_frames_transcription.ipynb) | 02.04.2020 |
  • Tianyi Zhang
  • Ian Simon - transformer)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/notebooks/magenta/piano_transformer/piano_transformer.ipynb) | 16.09.2019 |
  • Jesse Engel
  • Jesse Engel
  • Ian Simon - rnn)</li><li>[data](http://www.piano-e-competition.com/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/notebooks/magenta/performance_rnn/performance_rnn.ipynb) | 11.07.2017 |
  • Jesse Engel - fastgen)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=AaALLWQmCdI), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=BOoSy-Pg8is)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/notebooks/magenta/nsynth/nsynth.ipynb) | 06.04.2017 |
  • Curtis Northcutt - badge.php?doi=10.1613/jair.1.12125)](https://doi.org/10.1613/jair.1.12125) [![](https://img.shields.io/github/stars/cleanlab/cleanlab?style=social)](https://github.com/cleanlab/cleanlab) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.00068)</li><li>[blog post](https://l7.curtisnorthcutt.com/confident-learning)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.cleanlab.ai/)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@sujathamudadla1213/cleanlab-python-library-34e0a37720ef)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://cleanlab.ai/slack)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/CleanlabAI)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BnOTv0f9Msk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nGye-lrsLRc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QHaT_AiUljw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/cleanlab/cleanlab/blob/master/docs/source/tutorials/image.ipynb) | 30.03.2024 |
  • Saran Tunyasuvunakool - tom)</li> <li>[Timothy Lillicrap](https://contrastiveconvergence.net/~timothylillicrap/index.php)</li> <li>[Nicolas Heess](https://scholar.google.com/citations?user=79k7bGEAAAAJ)</li> <li>[Yuval Tassa](https://github.com/yuvaltassa)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/dm_control?style=social)](https://github.com/deepmind/dm_control) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.12983), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.00690), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1902.07151), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1707.02286), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.09564), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.10567)</li><li>[blog post](https://www.deepmind.com/publications/dm-control-software-and-tasks-for-continuous-control)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Tippe_top)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/CMjoiU482Jk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rAai4QzcYbs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/WhaRsrlaXLk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/dm_control/blob/master/tutorial.ipynb) | 28.03.2024 |
  • Emo Todorov - tom)</li> <li>[Yuval Tassa](https://github.com/yuvaltassa)</li></ul> | [![](https://img.shields.io/github/stars/deepmind/mujoco?style=social)](https://github.com/deepmind/mujoco) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.12983)</li><li>[blog post](https://www.deepmind.com/blog/opening-up-a-physics-simulator-for-robotics), [blog post](https://www.deepmind.com/blog/open-sourcing-mujoco)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mujoco.readthedocs.io/en/latest/overview.html)</li><li>[website](https://mujoco.org/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Tippe_top), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Chaos_theory), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/3D_projection#Mathematical_formula)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0ORsj_E17B0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yHZVVfsJ8mc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/eyzzsGJ1iic)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/dm_control/blob/master/dm_control/mujoco/tutorial.ipynb) | 28.03.2024 |
  • Nikhila Ravi - novotny.github.io/)</li> <li>[Taylor Gordon](https://scholar.google.com/citations?user=CNOoeQ0AAAAJ)</li><details><summary>others</summary><li>[Wan-Yen Lo](https://github.com/wanyenlo)</li> <li>[Justin Johnson](https://web.eecs.umich.edu/~justincj/)</li> <li>[Georgia Gkioxari](https://gkioxari.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/pytorch3d?style=social)](https://github.com/facebookresearch/pytorch3d) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2007.08501), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1906.02739)</li><li>[blog post](https://ai.meta.com/blog/implicitron-a-new-modular-extensible-framework-for-neural-implicit-representations-in-pytorch3d/), [blog post](https://ai.meta.com/blog/-introducing-pytorch3d-an-open-source-library-for-3d-deep-learning/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pytorch3d.readthedocs.org/)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/sohonjit/rendering-with-pytorch3d)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/glimpse-into-pytorch3d-an-open-source-3d-deep-learning-library-291a4beba30f), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@phamtdong0406/crafting-realistic-renderings-with-pytorch3d-947a38194f0a), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/how-to-render-3d-files-using-pytorch3d-ef9de72483f8)</li><li>[website](https://pytorch3d.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0JEb7knenps), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Pph1r-x9nyY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/eCDBA_SbxCE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MOBAJb5nJRI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g50RiDnfIfY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hgBk9WlF-XA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Sb9gCCnSAUg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZLqJ33Ey-MU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/pytorch3d/blob/master/docs/tutorials/implicitron_config_system.ipynb) | 28.03.2024 |
  • Edgar Riba - badge.php?doi=10.1109/WACV45572.2020.9093363)](https://doi.org/10.1109/WACV45572.2020.9093363) [![](https://img.shields.io/github/stars/kornia/kornia?style=social)](https://github.com/kornia/kornia) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.02190)</li><li>[blog post](https://opencv.org/kornia-an-open-source-differentiable-computer-vision-library-for-pytorch/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://kornia.readthedocs.io/en/latest/)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://join.slack.com/t/kornia/shared_invite/zt-csobk21g-2AQRi~X9Uu6PLMuUZdvfjA)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/kornia_foss)</li><li>[website](https://kornia.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCI1SE1Ij2Fast5BSKxoa7Ag), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3RmCYFhwclE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/AAZa-mXjYF0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/kornia/kornia/blob/master/examples/augmentation/kornia_augmentation.ipynb) | 27.03.2024 |
  • intel - us/forums/computer-vision)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvinotoolkit/open_model_zoo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Tencent/TNN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvinotoolkit/openvino_contrib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvinotoolkit/training_extensions), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvinotoolkit/model_server), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/opencv/cvat), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvinotoolkit/datumaro)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/OpenVINO)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@openvino), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/openvino-toolkit)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/OpenVINO)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLg-UKERBljNxdIQir1wrirZJ50yTp4eHv), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Je8n8M0OwxQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ru51DELfc-Q), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5X0RmlH6JI4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hhVRSLbpI5Q), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/JH8fsEAIaXo), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLWw98q-Xe7iH06qxEW5a22SBsSNsGnYjZ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openvinotoolkit/openvino_notebooks/blob/main/notebooks/001-hello-world/001-hello-world.ipynb) | 25.03.2024 |
  • Sourab Mangrulkar - tune-flan-t5-peft)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://huggingface.co/docs/peft)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeed/issues/3002)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/ought/raft/viewer/twitter_complaints), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/bigscience/T0_3B), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/bigscience/mt0-xxl), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/opt-6.7b), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/roberta-large), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/glue/viewer/mrpc)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YVU5wAA6Txo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Us5ZFp16PaU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YKCtbIJC3kQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/peft/blob/master/examples/int8_training/Finetune_flan_t5_large_bnb_peft.ipynb) | 21.03.2024 |
  • Zachary Charles - analytics-collaborative-data.html)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/federated-learning)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/api_docs/python/tff/learning/Model)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/building_your_own_federated_learning_algorithm.ipynb) | 20.03.2024 |
  • Krzysztof Ostrowski - special-database-19)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/tensorflow/standardizing-on-keras-guidance-on-high-level-apis-in-tensorflow-2-0-bad2b04c819a)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/federated-learning), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/image-classification)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/federated_learning_for_image_classification.ipynb) | 20.03.2024 |
  • Krzysztof Ostrowski - learning)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/federated_core), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/federated_learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/custom_federated_algorithms_1.ipynb) | 20.03.2024 |
  • Krzysztof Ostrowski - learning)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/federated_core), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/federated_learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/custom_federated_algorithms_2.ipynb) | 20.03.2024 |
  • Weikang Song - learning)</li><li>[tensor encoding](http://jakubkonecny.com/files/tensor_encoding.pdf)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/api_docs/python/tff/simulation/datasets/emnist), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/api_docs/python/tff/learning/build_federated_averaging_process)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/tff_for_federated_learning_research_compression.ipynb) | 20.03.2024 |
  • Krzysztof Ostrowski - learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/simulations.ipynb) | 20.03.2024 |
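The TensorFlow Federated tutorials above all build on the Federated Averaging (FedAvg) pattern: broadcast the global model, train locally on each client, then average the client models. A minimal plain-Python sketch of that idea (no TFF; the one-weight least-squares model and the client data are made up for illustration):

```python
def local_step(weights, data, lr=0.1):
    """One gradient step of least-squares y ~ w*x on a single client's data."""
    grad = [0.0] * len(weights)
    for x, y in data:
        pred = sum(w * xi for w, xi in zip(weights, x))
        err = pred - y
        for i, xi in enumerate(x):
            grad[i] += 2 * err * xi / len(data)
    return [w - lr * g for w, g in zip(weights, grad)]

def federated_average(global_weights, client_datasets, rounds=50):
    """Each round: broadcast weights, train locally, average the client models."""
    w = list(global_weights)
    for _ in range(rounds):
        client_models = [local_step(w, data) for data in client_datasets]
        w = [sum(ws) / len(client_models) for ws in zip(*client_models)]
    return w

# Two clients whose local data both follow y = 2*x; FedAvg recovers w = 2
# without either client's raw data ever leaving the "device".
clients = [[([1.0], 2.0), ([2.0], 4.0)], [([3.0], 6.0)]]
w = federated_average([0.0], clients, rounds=200)
```

TFF's `tff.learning` APIs wrap this same loop with real models, sampling, and secure aggregation; the "building your own federated learning algorithm" notebook walks through the broadcast/train/aggregate steps explicitly.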
  • intel - extension-for-transformers?style=social)](https://github.com/intel/intel-extension-for-transformers) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.17453), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2311.00502), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.07715), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2210.17114), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.05754)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/Wxk3J3ZJkU)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://intel.github.io/intel-extension-for-transformers/latest/docs/Welcome.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/ggml), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/llama.cpp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TimDettmers/bitsandbytes), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lm-sys/FastChat), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IntelLabs/fastRAG), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IST-DASLab/gptq), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mit-han-lab/streaming-llm)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/blog/assisted-generation), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Intel/neural-chat-7b-v3-1), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/blog/Andyrasika/neural-chat-intel)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@NeuralCompressor/creating-your-own-llms-on-your-laptop-a08cc4f7c91b), [<img src="images/medium.svg" alt="medium" 
height=20/>](https://medium.com/@NeuralCompressor/the-practice-of-supervised-finetuning-and-direct-preference-optimization-on-habana-gaudi2-a1197d8a3cd3), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@NeuralCompressor/llm-performance-of-intel-extension-for-transformers-f7d061556176), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@NeuralCompressor/high-performance-low-bit-layer-wise-weight-only-quantization-on-a-laptop-712580899396), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/intel-analytics-software/reduce-large-language-model-carbon-footprint-with-intel-neural-compressor-and-intel-extension-for-dfadec3af76a)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bWhZ1u_1rlc), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=RbKRELWP9y8&t=2954s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7_urstS-noU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/KWT6yKfu4n0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/intel/intel-extension-for-transformers/blob/main/docs/tutorials/pytorch/text-classification/SetFit_model_compression_AGNews.ipynb) | 19.03.2024 |
  • EnzymeZoo - art/deforum-stable-diffusion?style=social)](https://github.com/deforum-art/deforum-stable-diffusion) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/deforum)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.google.com/document/d/1RrQv7FntzOuLg4ohjRZPVL7iptIyBhwwbcEYEW2OfcI)</li><li>[project](https://deforum.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w_sxuDMt_V0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bicPayZDI60), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dqkQo2alZvU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deforum-art/deforum-stable-diffusion/blob/main/Deforum_Stable_Diffusion.ipynb) | 14.03.2024 |
  • microsoft - us/research/blog/autogen-enabling-next-generation-large-language-model-applications/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/pAbnFJrkgZ)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@multiplatform.ai/microsoft-autogen-transforming-ai-frameworks-for-enhanced-problem-solving-video-ac2655e7cdf)</li><li>[project](https://microsoft.github.io/autogen/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zdcCD--IieY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dCCr52uT0W8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/JMpgsx74XDI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/agentchat_RetrieveChat.ipynb) | 14.03.2024 |
  • Roboflow - on-a-dataset-with-a-roboflow-model)</li><li>[website](https://supervision.roboflow.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uWP6UjDeZvY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4Q3ut7vqD5o), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtube.com/roboflow)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow/supervision/blob/main/demo.ipynb) | 13.03.2024 |
  • François Chollet - learning)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Transfer_learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/images/transfer_learning.ipynb) | 12.03.2024 |
  • Alex Wiltschko - 467105048) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1406.2572), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.04454), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.03451), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.07062)</li><li>[book](https://mitpress.mit.edu/sites/default/files/titles/content/sicm_edition_2/book.html), [book](https://mitpress.mit.edu/books/functional-differential-geometry)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/jax#auto-vectorization-with-vmap), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hips/autograd)</li><li>[tutorial](http://videolectures.net/deeplearning2017_johnson_automatic_differentiation/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Truncated_Newton_method), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Pullback_(differential_geometry)), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Holomorphic_function), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Cauchy%E2%80%93Riemann_equations)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/jax/blob/main/docs/notebooks/autodiff_cookbook.ipynb) | 07.03.2024 |
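The autodiff cookbook above is built on forward- and reverse-mode differentiation primitives (JVPs and VJPs). The core trick of forward mode can be shown with dual numbers in a few lines of plain Python; this `Dual` class is an illustration of the mechanism, not JAX's implementation:

```python
class Dual:
    """A value paired with a tangent (derivative); arithmetic carries both."""
    def __init__(self, val, tan=0.0):
        self.val, self.tan = val, tan
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.tan + o.tan)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.tan * o.val + self.val * o.tan)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x with a unit tangent; the output tangent is f'(x)."""
    return f(Dual(x, 1.0)).tan

# d/dx (3x^2 + 2x) at x = 4 is 6*4 + 2 = 26
d = derivative(lambda x: 3 * x * x + 2 * x, 4.0)
```

Reverse mode (what `jax.grad` uses) records the same local derivatives but propagates them backwards, which is why it scales to functions with many inputs and one output.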
  • Glenn Jocher - net.org/)</li><li>[blog post](https://habr.com/ru/articles/710016/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://ultralytics.com/discord)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/ultralytics/ultralytics)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.ultralytics.com/)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/ultralytics/yolov8)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/ultralytics)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtube.com/ultralytics), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/m9fH9OWn8YM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/wuZtUMEiKWY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gRAyOPjQ9_s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fhzCwJkDONE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/IHbJcOex6dk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ultralytics/ultralytics/blob/main/examples/tutorial.ipynb) | 04.03.2024 |
  • Thomas Simonini - rl-class?style=social)](https://github.com/huggingface/deep-rl-class) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/alex-petrenko/sample-factory)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/deep-rl-course/unit0/introduction), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/huggingface-projects/Deep-Reinforcement-Learning-Leaderboard)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html)</li><li>[syllabus](https://simoninithomas.github.io/deep-rl-course)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2GwBez0D20A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/CsuIANBnSq8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/AQKAOXJa6qg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/deep-rl-class/blob/main/notebooks/unit1/unit1.ipynb) | 02.03.2024 |
  • Samet Akcay - intel)</li> <li>[Utku Genc](https://github.com/ugenc-intel)</li></ul></details> | [![](https://img.shields.io/github/stars/openvinotoolkit/anomalib?style=social)](https://github.com/openvinotoolkit/anomalib) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.08785)</li><li>[data](https://www.mvtec.com/company/research/datasets/mvtec-ad)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://openvinotoolkit.github.io/anomalib/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vnk8071/anomaly-detection-in-industry-manufacturing/tree/master/anomalib_contribute)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/getting-started-with-pytorch-image-models-timm-a-practitioners-guide-4e77b4bf9055)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/lib/timm)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openvinotoolkit/anomalib/blob/main/notebooks/000_getting_started/001_getting_started.ipynb) | 01.03.2024 |
  • Matthew Tancik - austin)</li> <li>[Kamyar Salahi](https://github.com/TheQuantumFractal)</li> <li>[Abhik Ahuja](https://abhikahuja.com/)</li> <li>[David McAllister](https://github.com/mcallisterdavid)</li> <li>[Angjoo Kanazawa](https://github.com/akanazawa)</li></ul></details> | [![](https://img.shields.io/github/stars/nerfstudio-project/nerfstudio?style=social)](https://github.com/nerfstudio-project/nerfstudio) <ul><li>[Viewer](https://viewer.nerf.studio/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2302.04264)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/uMbNqcraFc)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.nerf.studio/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/tiny-cuda-nn)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/nerfstudioteam)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/XwKq7qDQCQk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nSFsugarWzk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/h5EWiRRxYEQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8cv9G7izdPY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/nerfstudio-project/nerfstudio/blob/main/colab/demo.ipynb) | 01.03.2024 |
  • Willem Pienaar - dev/feast?style=social)](https://github.com/feast-dev/feast) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.feast.dev/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/baineng/feast-hive), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Shopify/feast-trino), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Azure/feast-azure), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/amundsen-io/amundsen/blob/main/databuilder/databuilder/extractor/feast_extractor.py)</li><li>[website](https://feast.dev/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/DaNv-Wf1MBA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/p2cuq4eJ2BY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/feast-dev/feast/blob/master/examples/quickstart/quickstart.ipynb) | 28.02.2024 |
  • Brian Moore - examples)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/voxel51), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/open-source-tools-for-fast-computer-vision-model-building-b39755aab490)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://slack.voxel51.com/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/voxel51)</li><li>[website](https://voxel51.com/fiftyone/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLuREAXoPgT0SJLKsgFzKxffMApbXp90Gi)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/voxel51/fiftyone-examples/blob/master/examples/quickstart.ipynb) | 27.02.2024 |
  • MetaVoice - src?style=social)](https://github.com/metavoiceio/metavoice-src) <ul><li>[demo](https://ttsdemo.themetavoice.xyz/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/metavoiceio)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/MetaVoiceAI)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Y_k3bHPcPTo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gVKbf31hrYs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1UmjE1mzfG4td0rCjJEaAWGQXpn_GuwwY) | 26.02.2024 |
  • w-okada - okada/voice-changer?style=social)](https://github.com/w-okada/voice-changer) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/yxlllc/DDSP-SVC)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/wok000/vcclient000)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/POo_Cg0eFMU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fba9Zhsukqw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/s_GirFEGvaA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Q7bbEC4aeKM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_JXbvSTGPoo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/pHhjg2JwdPI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/We5oYpCR3WQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/aVfoC1EHlVs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YF1lBaqeyt8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/hinabl/voice-changer-colab/blob/master/Hina_Modified_Realtime_Voice_Changer_on_Colab.ipynb) | 26.02.2024 |
  • microsoft - development-for-beginners?style=social)](https://github.com/microsoft/xr-development-for-beginners) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://aka.ms/genai-discord)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Web-Dev-For-Beginners)</li><li>[project](https://microsoft.github.io/generative-ai-for-beginners/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/generative-ai-for-beginners/blob/main/06-text-generation-apps/notebook-azure-openai.ipynb) | 22.02.2024 |
  • Omry Yadan - omegaconf-a33be1b748ab)</li><li>[slides](https://docs.google.com/presentation/d/e/2PACX-1vT_UIV7hCnquIbLUm4NnkUpXvPEh33IKiUEvPRF850WKA8opOlZOszjKdZ3tPmf8u7hGNP6HpqS-NT5/pub)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/omry/omegaconf/blob/master/docs/notebook/Tutorial.ipynb) | 15.02.2024 |
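The central operation in hierarchical config libraries like OmegaConf is merging a base config with overrides, key by key through nested sections. A plain-dict sketch of that idea (illustration only; OmegaConf's real entry point is `OmegaConf.merge`, which adds interpolation, typing, and structured configs on top):

```python
def deep_merge(base, override):
    """Return base updated by override, merging nested dicts recursively."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)  # recurse into sections
        else:
            merged[key] = value                           # leaf: override wins
    return merged

# hypothetical defaults plus a command-line override of one leaf
defaults = {"model": {"lr": 0.1, "layers": 4}, "seed": 0}
cli = {"model": {"lr": 0.01}}
cfg = deep_merge(defaults, cli)  # lr overridden, layers and seed preserved
```

Note the override replaces only the `lr` leaf; sibling keys in the same section survive, which is what distinguishes a config merge from `dict.update`.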
  • Takuya Akiba - votte)</li> <li>[Toshihiko Yanase](https://github.com/toshihikoyanase)</li> <li>[Takeru Ohta](https://github.com/sile)</li> <li>[Masanori Koyama](https://scholar.google.com/citations?user=oY1gA10AAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/optuna/optuna?style=social)](https://github.com/optuna/optuna) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.10902)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/optuna/optuna)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://optuna.readthedocs.io/en/stable/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/optuna/optuna-dashboard)</li><li>[website](https://optuna.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J_aymk4YXhg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tcrcLRopTX0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-UeC4MR3PHM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oC8zFYcfYXU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/optuna/optuna-examples/blob/main/quickstart.ipynb) | 15.02.2024 |
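The pattern Optuna's quickstart demonstrates is define-an-objective, sample-parameters, keep-the-best. Reduced to a plain random-search loop it looks like the sketch below (this is NOT Optuna's API; Optuna layers smarter samplers such as TPE, plus pruning and dashboards, on this same pattern, and the toy objective here is invented for illustration):

```python
import random

def objective(params):
    # toy objective, minimized at x = 2, y = -1
    return (params["x"] - 2) ** 2 + (params["y"] + 1) ** 2

def random_search(objective, n_trials=500, seed=0):
    """Sample parameter sets uniformly and keep the best-scoring trial."""
    rng = random.Random(seed)
    best_params, best_value = None, float("inf")
    for _ in range(n_trials):
        params = {"x": rng.uniform(-10, 10), "y": rng.uniform(-10, 10)}
        value = objective(params)
        if value < best_value:
            best_params, best_value = params, value
    return best_params, best_value

best, value = random_search(objective)
```

In Optuna the sampling lines become `trial.suggest_float(...)` calls and the loop becomes `study.optimize(objective, n_trials=...)`, but the contract is the same: the objective maps sampled parameters to a score.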
  • Billy Lamberta - augmentation)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/tf_flowers)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Data_augmentation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/images/data_augmentation.ipynb) | 14.02.2024 |
  • Stability AI - AI/StableCascade?style=social)](https://github.com/Stability-AI/StableCascade) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.00637)</li><li>[blog post](https://stability.ai/news/introducing-stable-cascade)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/stablediffusion)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-cascade), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/nateraw/parti-prompts)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/intelligent-art/stable-cascade-a-super-easy-local-installation-guide-ce0cbd06d800), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@yushantripleseven/stable-cascade-training-inference-a52e12ecc5fa)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/stabilityai)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ybu6qTbEsewc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/JuX-uukwdkI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YMxXtaiVHks), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/UgM-z2q3Xe0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/W6YLIyA3Kco), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/X1rLWFRagIw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mkshing/notebooks/blob/main/stable_cascade.ipynb) | 14.02.2024 |
  • cleanlab - examples)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://cleanlab.ai/slack)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/CleanlabAI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/cleanlab/cleanvision/blob/main/docs/source/tutorials/tutorial.ipynb) | 13.02.2024 |
  • Jinbo Xing - Tsin Wong](https://ttwong12.github.io/myself.html)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.12190)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/chaojie/ComfyUI-DynamiCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AILab-CVC/VideoCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/YingqingHe/ScaleCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AILab-CVC/TaleCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AILab-CVC/FreeNoise)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Doubiiu/DynamiCrafter_1024)</li><li>[project](https://doubiiu.github.io/projects/DynamiCrafter/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1aj7gcw/dynamicrafter_gets_updated/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://x.com/noguchis/status/1754488826016432341?s=20)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0NfmIsNAg-g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/PtW7hjCawbo)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/DynamiCrafter-colab/blob/main/DynamiCrafter_colab_576_1024.ipynb) | 12.02.2024 |
  • Artiprocher - Studio?style=social)](https://github.com/Artiprocher/DiffSynth-Studio) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2401.16224)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Helsinki-NLP/opus-mt-en-zh), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/alibaba-pai/pai-bloom-1b1-text2prompt-sd)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Artiprocher/DiffSynth-Studio/blob/main/examples/Diffutoon.ipynb) | 05.02.2024 |
  • The Mosaic ML Team - best-practices-for-efficient-model-training)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](http://docs.mosaicml.com/)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://join.slack.com/t/mosaicml-community/shared_invite/zt-w0tiddn9-WGTlRpfjcO9J5jyrMub1dg)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/mosaicml)</li><li>[website](https://www.mosaicml.com/composer)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Amdahl's_law)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/@mosaicml6047/videos), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/n-1WV5QdIDc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Xi_5wq2MpOw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mosaicml/composer/blob/dev/examples/getting_started.ipynb) | 01.02.2024 |
  • Daniel Freeman
  • autodistill - grounded-sam), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-yolov8), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-yolonas), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-yolov5), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-detr), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-detic), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-grounding-dino), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-owl-vit), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-sam-clip), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-llava), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-kosmos-2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-owlv2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-roboflow-universe), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-azure-vision), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-rekognition), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-gcp-vision), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/roboflow/inference)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gKTYMfwPo4M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/M_QZ_Q0zT0k), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtube.com/roboflow)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/how-to-auto-train-yolov8-model-with-autodistill.ipynb) | 31.01.2024 |
  • Billy Lamberta
  • Google - data-analyst?style=social)](https://github.com/GoogleCloudPlatform/training-data-analyst/tree/master/blogs/integrated_gradients) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.01365)</li><li>[visualizing](https://distill.pub/2020/attribution-baselines/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Explainable_artificial_intelligence), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Linear_interpolation), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Riemann_sum)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/interpretability/integrated_gradients.ipynb) | 17.01.2024 |
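The integrated-gradients tutorial linked above approximates a path integral with a Riemann sum over linearly interpolated inputs; a minimal 1-D sketch of that idea (pure Python, hypothetical names, not the TensorFlow implementation):

```python
def integrated_gradient_1d(grad_fn, baseline, x, steps=200):
    """Approximate IG(x) = (x - baseline) * average gradient along the
    straight-line path from baseline to x (midpoint Riemann sum)."""
    total = 0.0
    for k in range(steps):
        alpha = (k + 0.5) / steps  # midpoint of each sub-interval in [0, 1]
        total += grad_fn(baseline + alpha * (x - baseline))
    return (x - baseline) * (total / steps)

# For f(x) = x**2 (gradient 2x) with baseline 0, IG recovers
# f(x) - f(baseline) = x**2, the completeness property.
print(integrated_gradient_1d(lambda z: 2.0 * z, 0.0, 3.0))
```

For vector inputs the tutorial does the same thing per feature, batching the interpolated points through the model.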
  • Alon Ziv - Diffusion/blob/main/Tutorials/AI-Music-Generation-Audiocraft-Tutorial.md#more-info-about-top-k-top-p-temperature-and-classifier-free-guidance-from-chatgpt)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/magnet-medium-10secs), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/magnet-medium-30secs), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/audio-magnet-medium)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://generativeai.pub/metas-ai-magnet-the-next-big-thing-in-text-to-audio-technology-7d524d9459ef)</li><li>[project](https://pages.cs.huji.ac.il/adiyoss-lab/MAGNeT/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/ArtificialInteligence/comments/19808gf/magnet_masked_audio_generation_using_a_single/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/MAGNeT-colab/blob/main/MAGNET_colab.ipynb) | 16.01.2024 |
  • RVC-Project - Project/Retrieval-based-Voice-Conversion-WebUI?style=social)](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/HcsmBBGyVk)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/auspicious3000/contentvec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/jik876/hifi-gan), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/FFmpeg/FFmpeg), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Anjok07/ultimatevocalremovergui), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvpi/audio-slicer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dream-High/RMVPE)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/lj1995/VoiceConversionWebUI)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@ja.harr91/decoding-the-sound-of-virality-a-deep-dive-into-adversarial-ai-for-voice-conversion-tasks-on-m1-d60d32cfb2d4)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-JcvdDErkAU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9TroP5mR3CM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Y8IxVVQBEpc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qZ12-Vm2ryc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5i_Pyw0gH-M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/RVC-Project/Retrieval-based-Voice-Conversion-WebUI/blob/main/Retrieval_based_Voice_Conversion_WebUI.ipynb) | 11.01.2024 |
  • Jonathan Heek - van-zee/)</li></ul></details> | [![](https://img.shields.io/github/stars/google/flax?style=social)](https://github.com/google/flax) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://flax.readthedocs.io/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://github.com/huggingface/transformers/tree/main/examples/flax)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/google-introduces-flax-a-neural-network-library-for-jax-84bdc6f8f160)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/erpdf7/p_flax_a_neural_network_library_for_jax_designed/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/e8StU6WQCqw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HOlQzrn84A4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5eUSmJvK8WA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/flax/blob/main/docs/quick_start.ipynb) | 10.01.2024 |
  • Lucas Beyer - research/big_vision?style=social)](https://github.com/google-research/big_vision) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2010.11929), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.04560), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.01601), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.01580), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.08013), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.13035), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.17376), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.07915), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.16999), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2302.08242), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.07159)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/data), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/big_vision/blob/main/big_vision/configs/proj/image_text/lit.ipynb) | 03.01.2024 |
  • Killian Lucas - interpreter?style=social)](https://github.com/KillianLucas/open-interpreter) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/6p3fD6rBVm)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.openinterpreter.com/)</li><li>[website](https://openinterpreter.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SqnXUHwIa3c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/s-f4lCETxu0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J-H2un1Adr0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/jaijpff58vw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7KFbG_3dKKs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4OhuFjPyZNQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/01tQLn_RRcE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uyfoHQVgeY0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1WKmRXZgsErej2xUriKzxrEAXdxMSgWbb) | 03.01.2024 |
  • Edouard Leurent - env?style=social)](https://github.com/eleurent/highway-env) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.03483), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.05701), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2101.07140)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://highway-env.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/eleurent/rl-agents), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/eleurent/finite-mdp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/baselines/tree/master/baselines/her)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/eleurent/highway-env/blob/master/scripts/parking_model_based.ipynb) | 03.01.2024 |
  • Loïc Barrault - An Chung](https://iamyuanchung.github.io/)</li> <li>[Mariano Coria](https://www.linkedin.com/in/marianocoria)</li> <li>[David Dale](https://daviddale.ru/)</li><details><summary>others</summary><li>[Ning Dong](https://scholar.google.com/citations?user=gg1hvjoAAAAJ)</li> <li>[Mark Duppenthaler](https://github.com/mduppes)</li> <li>[Paul-Ambroise Duquenne](https://scholar.google.com/citations?user=Uah8IcAAAAAJ)</li> <li>[Hady Elsahar](https://www.hadyelsahar.io/)</li> <li>[Min-Jae Hwang](https://mjhwang93.github.io/)</li> <li>[Hirofumi Inaguma](https://hirofumi0810.github.io/)</li> <li>[Ilia Kulikov](https://github.com/uralik)</li> <li>[Pengwei Li](https://scholar.google.com/citations?user=hQB3YsYAAAAJ)</li> <li>[Daniel Licht](https://github.com/Lichtphyz)</li> <li>[Jean Maillard](https://scholar.google.com/citations?user=_ewOoK0AAAAJ)</li> <li>[Ruslan Mavlyutov](https://github.com/mavlyutovr)</li> <li>[Kaushik Ram Sadagopan](https://github.com/kauterry)</li> <li>[Abinesh Ramakrishnan](https://github.com/ibanesh)</li> <li>[Tuan Tran](https://antoine-tran.github.io/)</li> <li>[Guillaume Wenzek](https://github.com/gwenzek)</li> <li>[Yilin Yang](https://yilinyang7.github.io/)</li> <li>[Ethan Ye](https://github.com/yeyinthtoon)</li> <li>[Ivan Evtimov](https://ivanevtimov.eu/)</li> <li>[Pierre Fernandez](https://pierrefdz.github.io/)</li> <li>[Robin San Roman](https://scholar.google.com/citations?user=AJ3ir84AAAAJ)</li> <li>[Bokai Yu](https://scholar.google.com/citations?user=7jNmPwUAAAAJ)</li> <li>[Pierre Andrews](https://github.com/Mortimerp9)</li> <li>[Can Balioglu](http://canbalioglu.com/)</li> <li>[Peng-Jen Chen](https://scholar.google.com/citations?user=rOXs9VMAAAAJ)</li> <li>[Marta Costa-jussà](https://costa-jussa.com/)</li> <li>[Maha Elbayad](http://elbayadm.github.io/)</li> <li>[Hongyu Gong](https://github.com/hygong-fb)</li> <li>[Francisco Guzmán](https://guzmanhe.github.io/)</li> <li>[Kevin Heffernan](https://github.com/heffernankevin)</li> <li>[Somya Jain](https://scholar.google.com/citations?user=AmBxU3kAAAAJ)</li> <li>[Justine Kao](https://scholar.google.com/citations?user=Y9BLeTAAAAAJ)</li> <li>[Ann Lee](https://www.stat.cmu.edu/~annlee/)</li> <li>[Xutai Ma](https://github.com/xutaima)</li> <li>[Benjamin Peloquin](https://scholar.google.com/citations?user=5GNAjB8AAAAJ)</li> <li>[Juan Pino](https://scholar.google.com/citations?user=weU_-4IAAAAJ)</li> <li>[Sravya Popuri](https://scholar.google.com/citations?user=MtmqG3UAAAAJ)</li> <li>[Holger Schwenk](https://github.com/hoschwenk)</li> <li>[Anna Sun](https://github.com/annasun28)</li> <li>[Paden Tomasello](https://scholar.google.com/citations?user=sBtWMGYAAAAJ)</li> <li>[Changhan Wang](https://www.changhan.me/)</li> <li>[Skyler Wang](https://www.skylerwang.com/)</li> <li>[Mary Williamson](https://scholar.google.com/citations?user=Ys4xB-QAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/seamless_communication?style=social)](https://github.com/facebookresearch/seamless_communication) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2312.05187)</li><li>[blog post](https://ai.meta.com/research/seamless-communication/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/libsndfile/libsndfile), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairseq2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/SimulEval), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/stopes), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/SONAR)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/seamless-m4t-v2-large), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/seamless-expressive), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/seamless-streaming)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://ngwaifoong92.medium.com/beginners-guide-to-seamlessm4t-81efad6e8ca6)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=0padjtkHXTE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rNN7qsoCKBo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RKEFZ44YOcc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/seamless_communication/blob/main/Seamless_Tutorial.ipynb) | 14.12.2023 |
  • Drengskapur
  • Nils Reimers - darmstadt.de/ukp/ukp_home/head_ukp/index.en.jsp)</li></ul> | [![](https://img.shields.io/github/stars/UKPLab/sentence-transformers?style=social)](https://github.com/UKPLab/sentence-transformers) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1908.10084), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.09813), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2010.08240)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://www.sbert.net/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/UKPLab/sentence-transformers/blob/master/examples/applications/retrieve_rerank/retrieve_rerank_simple_wikipedia.ipynb) | 07.12.2023 |
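The retrieve & rerank notebook linked above scores passages against a query by embedding similarity before a cross-encoder reranks them; a toy sketch of the cosine-similarity retrieval step (pure Python, made-up vectors, not the sentence-transformers API):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_by_similarity(query_vec, doc_vecs):
    """Return document indices sorted by descending cosine similarity,
    as in the first-stage retrieval before reranking."""
    scores = [(cosine_similarity(query_vec, d), i) for i, d in enumerate(doc_vecs)]
    return [i for _, i in sorted(scores, reverse=True)]

docs = [[1.0, 0.0], [0.6, 0.8], [0.0, 1.0]]  # toy passage embeddings
print(rank_by_similarity([1.0, 0.1], docs))  # → [0, 1, 2]
```

In the real notebook the vectors come from a bi-encoder model and the top hits are re-scored by a slower, more accurate cross-encoder.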
  • EleutherAI - evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.14165)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/eleutherai)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/AutoGPTQ/AutoGPTQ), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/EleutherAI/gpt-neox), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Megatron-DeepSpeed), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vllm-project/vllm)</li><li>[project](https://www.eleuther.ai/projects/large-language-model-evaluation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/EleutherAI/lm-evaluation-harness/blob/main/examples/lm-eval-overview.ipynb) | 30.11.2023 |
  • Shengyi Huang - ai/CORL), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Farama-Foundation/Gymnasium), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/baselines), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ikostrikov/jaxrl)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/cleanrl)</li><li>[paper](https://www.jmlr.org/papers/v23/21-1342.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCDdC6BIFRI0jvcwuhi3aI6w), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dm4HdGujpPs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/vwxyzjn/cleanrl/blob/master/docs/get-started/CleanRL_Huggingface_Integration_Demo.ipynb) | 28.11.2023 |
  • Glenn Jocher
  • Sanchit Gandhi - whisper?style=social)](https://github.com/huggingface/distil-whisper) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2311.00430), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.17192)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/huggingface/safetensors), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dao-AILab/flash-attention)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/collections/distil-whisper/training-datasets-6538d05c69721489d1db1e49), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/auto#transformers.AutoModelForSpeechSeq2Seq), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/auto#transformers.AutoProcessor), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/whisper#transformers.WhisperForConditionalGeneration.forward.example), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/main_classes/text_generation#transformers.GenerationMixin.generate.assistant_model), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/perf_infer_gpu_one#flashattention-2), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/perf_infer_gpu_one#bettertransformer)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/prompt-engineering/transcribing-audio-with-python-and-distil-whisper-9b4fec3d53bf)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/17vqtcb/p_distilwhisper_a_distilled_variant_of_whisper/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/46Q6fbdUCbg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SZtHEKyvuug), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/kI1pA1CADxM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/sanchit-gandhi/notebooks/blob/main/Distil_Whisper_Benchmark.ipynb) | 08.11.2023 |
  • Shishir Patil - llm/gorilla-cli)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/latinxinai/try-gorilla-a-large-language-model-connected-with-massive-apis-442f3b554ffb)</li><li>[project](http://gorilla.cs.berkeley.edu/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4EdyWkcddPc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RMgM3tPTpXI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/CX1Kzijq2TI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8AqQBPI4CFI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/iQwYoii4YiI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/alDArqcxSvw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/EypdTAlmoo4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LkV5DTRNxAg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1DEBPsccVLF_aUnmD0FwPeHFrtdC0QIUP) | 07.11.2023 |
  • Yuwei Guo - revolution/sd-webui-animatediff), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/talesofai/AnimateDiff), [<img src="images/git.svg" alt="git" height=20/>](https://youtu.be/-wki7IrQ_sU)</li><li>[project](https://animatediff.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rdnOhM8L8nE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LcHAZaJjA5k), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/66JgpI3a650?feature=share)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/AnimateDiff-colab/blob/main/AnimateDiff_colab.ipynb) | 30.10.2023 |
  • intel - compressor?style=social)](https://github.com/intel/neural-compressor) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.14592), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.05516), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.07715)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.com/invite/Wxk3J3ZJkU)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://github.com/intel/neural-compressor)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/intel/intel-extension-for-tensorflow), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/intel/intel-extension-for-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Lightning-AI/pytorch-lightning/blob/master/docs/source-pytorch/advanced/post_training_quantization.rst)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/pytorch/pytorch-inference-acceleration-with-intel-neural-compressor-842ef4210d7d), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/intel-analytics-software/efficient-text-classification-with-intel-neural-compressor-4853296deeac)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://neurips.cc/virtual/2022/59433)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/tutorials/recipes/intel_neural_compressor_for_pytorch.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SswQbIHUrvQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5xHKe4wWLes), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/H7Gg-EmGpAI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ie3w_j0Ntsk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/m2LokuUdeVg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/38wrDHEQZuM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/intel/neural-compressor/blob/master/examples/notebook/onnxruntime/Quick_Started_Notebook_of_INC_for_ONNXRuntime.ipynb) | 27.10.2023 |
  • suno - ai/bark?style=social)](https://github.com/suno-ai/bark) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.03143), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.02111)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/J2B2vsjKuE)</li><li>[examples](https://suno-ai.notion.site/Bark-Examples-5edae8b02a604b54a42244ba45ebc2e2)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/encodec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/karpathy/nanoGPT)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/huggingface_hub/package_reference/environment_variables#hfhome)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/OnusFM)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/84LzaXAo6vE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rU5Do9yHbwM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w41-MUfxIWo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_m-MxEpHUQY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dWWkZzvu7L9Bunq9zvD-W02RFUXoW-Pd) | 25.10.2023 |
  • comfyanonymous
  • Nikita Martynov - kozlova)</li> <li>[Katerina Kolomeytseva](https://www.linkedin.com/in/katerina-kolomeytseva-394a7a21a)</li><details><summary>others</summary><li>[Aleksandr Abramov](https://github.com/Ab1992ao)</li> <li>[Alena Fenogenova](https://github.com/Alenush)</li></ul></details> | [![](https://img.shields.io/github/stars/ai-forever/sage?style=social)](https://github.com/ai-forever/sage) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.09435)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ai-forever/augmentex)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/RuM2M100-1.2B), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/FRED-T5-large-spell), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/RuM2M100-418M), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/T5-large-spell), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/ai-forever/spellcheck_benchmark)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Levenshtein_distance)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yFfkV0Qjuu0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/sage/blob/main/notebooks/text_correction_demo.ipynb) | 11.10.2023 |
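The sage entry links the Levenshtein-distance article, since edit distance is the standard metric in spelling-correction benchmarks like the one above; a textbook dynamic-programming sketch (illustrative only, not sage's implementation):

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance via the classic DP recurrence, keeping only the
    previous row of the (len(a)+1) x (len(b)+1) table."""
    prev = list(range(len(b) + 1))        # distance from "" to prefixes of b
    for i, ca in enumerate(a, start=1):
        curr = [i]                        # distance from a[:i] to ""
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution / match
        prev = curr
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # → 3
```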
  • Albert Jiang - maria-lengyel)</li> <li>[Guillaume Lample](https://github.com/glample)</li> <li>[Lucile Saulnier](https://scholar.google.com/citations?user=Baj_9IsAAAAJ)</li> <li>[Lélio Renard Lavaud](https://github.com/lerela)</li> <li>[Marie-Anne Lachaux](https://scholar.google.com/citations?user=dSEMIJ8AAAAJ)</li> <li>[Pierre Stock](https://github.com/pierrestock)</li> <li>[Teven Scao](https://scholar.google.com/citations?user=ik0_vxsAAAAJ)</li> <li>[Thibaut Lavril](https://scholar.google.com/citations?user=9nPunCEAAAAJ)</li> <li>[Thomas Wang](https://github.com/thomasw21)</li> <li>[Timothée Lacroix](https://scholar.google.com/citations?&user=tZGS6dIAAAAJ)</li> <li>[William Sayed](https://www.linkedin.com/in/william-el-sayed-48672312a)</li></ul></details> | [![](https://img.shields.io/github/stars/mistralai/mistral-src?style=social)](https://github.com/mistralai/mistral-src) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.06825), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.10509), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.05150), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.05685)</li><li>[blog post](https://mistral.ai/news/announcing-mistral-7b/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.com/invite/mistralai)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.mistral.ai/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/vllm-project/vllm), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lm-sys/FastChat), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/ggml), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dao-AILab/flash-attention), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/skypilot-org/skypilot)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/mistralai)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/mistral-7b-recipes-for-fine-tuning-and-quantization-on-your-computer-631401583f77)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g7kVVBlCGo0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ASpageg8nPw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/OMIuP6lQXe4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/jnPZApwtE4I), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3SdopNwQJ-c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/Mistral-colab/blob/main/Mistral_colab.ipynb) | 09.10.2023 |
  • Lvmin Zhang - Q), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TJkrzuPdmvE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NfNwmKM3sxc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/lllyasviel/Fooocus/blob/main/colab.ipynb) | 03.10.2023 |
  • Mark Daoust - actor-critic-algorithms.pdf), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/1713-policy-gradient-methods-for-reinforcement-learning-with-function-approximation.pdf)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Temporal_difference_learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/reinforcement_learning/actor_critic.ipynb) | 27.09.2023 |
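The actor-critic tutorial linked above links temporal-difference learning, which drives its critic's value updates; a minimal tabular TD(0) sketch (pure Python, hypothetical names, not the TensorFlow code):

```python
def td0_update(V, state, reward, next_state, alpha=0.1, gamma=0.99):
    """One TD(0) step: move V[state] toward the bootstrapped target
    reward + gamma * V[next_state]; the TD error also serves as the
    advantage signal for the actor in actor-critic methods."""
    td_error = reward + gamma * V[next_state] - V[state]
    V[state] += alpha * td_error
    return td_error

V = {"s0": 0.0, "s1": 1.0}                       # toy value table
err = td0_update(V, "s0", reward=0.5, next_state="s1")
print(round(V["s0"], 4), round(err, 4))          # small step toward the target
```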
  • Google - signal-processing/stft-2-tjEQe)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/speech-recognition)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/speech_commands)</li><li>[tf.js](https://codelabs.developers.google.com/codelabs/tensorflowjs-audio-codelab/index.html#0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/audio/simple_audio.ipynb) | 27.09.2023 |
  • OpenMMLab - mmlab/mmagic?style=social)](https://github.com/open-mmlab/mmagic) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/raweFPmdzG)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mmagic.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmgeneration), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmengine/blob/main/mmengine/model/wrappers/seperate_distributed.py), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmcv), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mim)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://openmmlab.medium.com/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/OpenMMLab)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/openmmlab)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/open-mmlab/mmagic/blob/main/demo/mmagic_inference_tutorial.ipynb) | 11.09.2023 |
  • Glenn Jocher
  • MMAction2 Contributors - mmlab/mmaction2?style=social)](https://github.com/open-mmlab/mmaction2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.13230), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.10161), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.17263), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.13586), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.05095), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.13042)</li><li>[data](https://sdolivia.github.io/FineGym/), [data](http://www.svcl.ucsd.edu/projects/resound/dataset.html), [data](https://research.google.com/ava/index.html), [data](https://www.deepmind.com/open-source/kinetics)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mmaction2.readthedocs.io/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmcv), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/SwinTransformer/Video-Swin-Transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Cogito2012/DEAR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xvjiarui/VFS), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/holistic-video-understanding/HVU-Dataset)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/open-mmlab/mmaction2/blob/master/demo/mmaction2_tutorial.ipynb) | 06.09.2023 |
  • Philipp Moritz - wang.github.io/)</li> <li>[Alexey Tumanov](https://faculty.cc.gatech.edu/~atumanov/)</li><details><summary>others</summary><li>[Richard Liaw](https://github.com/richardliaw)</li> <li>[Eric Liang](https://github.com/ericl)</li> <li>[Melih Elibol](https://research.nvidia.com/person/melih-elibol)</li> <li>[Zongheng Yang](https://zongheng.me/)</li> <li>[William Paul](https://github.com/Wapaul1)</li> <li>[Michael Jordan](https://people.eecs.berkeley.edu/~jordan/)</li> <li>[Ion Stoica](https://people.eecs.berkeley.edu/~istoica/)</li></ul></details> | [![](https://img.shields.io/github/stars/ray-project/ray?style=social)](https://github.com/ray-project/ray) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1712.05889), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.05072), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1712.09381), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.05118), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.03924)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.ray.io/en/latest/index.html)</li><li>[website](https://www.ray.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LmROEotKhJA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uzt-CwohQC8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/XME90SGL6Vs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ray-project/ray/blob/master/doc/source/tune/examples/optuna_example.ipynb) | 06.09.2023 |
  • Billy Lamberta - classification)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/images/classification.ipynb) | 31.08.2023 |
  • Chris Paxton - robot?style=social)](https://github.com/facebookresearch/home-robot) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cpaxton/contact_graspnet/tree/cpaxton/devel), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_body), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_firmware), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_ros), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_ros2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_web_interface), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RoboStack/ros-noetic), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/codekansas/stretch-robot)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/home-robot/blob/master/src/home_robot_sim/notebooks/velocity_control_sim.ipynb) | 30.08.2023 |
  • Robin Rombach - qp)</li> <li>[Patrick Esser](https://github.com/pesser)</li><details><summary>others</summary><li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li> <li>[qunash](https://github.com/qunash)</li></ul></details> | [![](https://img.shields.io/github/stars/Stability-AI/stablediffusion?style=social)](https://github.com/Stability-AI/stablediffusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10752), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.00512), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2010.02502), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.01073), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.09778), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.00927)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/qunash/stable-diffusion-2-gui), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/isl-org/MiDaS), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/denoising-diffusion-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/runwayml/stable-diffusion/blob/main/scripts/inpaint_st.py), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/crowsonkb/k-diffusion)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-diffusion-2-1), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-diffusion-2-1-base), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-diffusion-2-depth), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-diffusion-2-inpainting)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HytucGhwTRs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/qunash/stable-diffusion-2-gui/blob/main/stable_diffusion_2_0.ipynb) | 26.08.2023 |
  • Boris Dayma - suraj)</li> <li>[Pedro Cuenca](https://github.com/pcuenca)</li> <li>[Khalid Saifullah](https://khalidsaifullaah.github.io/)</li><details><summary>others</summary><li>[Tanishq Abraham](https://github.com/tmabraham)</li> <li>[Phúc H. Lê Khắc](https://lkhphuc.com/)</li> <li>[Luke Melas](https://lukemelas.github.io/)</li> <li>[Ritobrata Ghosh](https://ghosh-r.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/borisdayma/dalle-mini?style=social)](https://github.com/borisdayma/dalle-mini) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.08981), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.13461), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.04015)</li><li>[blog post](https://wandb.ai/dalle-mini/dalle-mini/reports/DALL-E-mini--Vmlldzo4NjIxODA)</li><li>[data](https://aclanthology.org/P18-1238/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/huggingface/transformers/tree/master/examples/research_projects/jax-projects), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/CLIP/blob/main/data/yfcc100m.md)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/flax-community/dalle-mini)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/borisdayma/dalle-mini/blob/main/tools/inference/inference_pipeline.ipynb) | 22.08.2023 |
  • Google - autoencoders-in-keras.html)</li><li>[book](https://www.deeplearningbook.org/contents/autoencoders.html)</li><li>[data](http://www.timeseriesclassification.com/description.php?Dataset=ECG5000)</li><li>[examples](https://anomagram.fastforwardlabs.com/#/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/autoencoder)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/generative/autoencoder.ipynb) | 14.08.2023 |
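The linked tutorial trains an autoencoder on ECG5000 and flags anomalies by reconstruction error. A minimal pure-Python sketch of that scoring idea, where the `reconstruct` callable is a hypothetical stand-in for a trained model:

```python
# Anomaly detection by reconstruction error, as in the linked autoencoder
# tutorial. "reconstruct" is a toy stand-in for a trained autoencoder that
# reproduces normal signals well and anomalous ones poorly.

def mse(a, b):
    """Mean squared error between two equal-length sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def flag_anomalies(signals, reconstruct, threshold):
    """Return indices of signals whose reconstruction error exceeds threshold."""
    return [i for i, s in enumerate(signals)
            if mse(s, reconstruct(s)) > threshold]

normal = [0.0, 1.0, 0.0, -1.0]
anomalous = [0.0, 3.0, 0.0, -3.0]
reconstruct = lambda s: normal  # always outputs the learned "normal" shape

print(flag_anomalies([normal, anomalous], reconstruct, threshold=0.5))  # [1]
```

The real tutorial learns `reconstruct` with a Keras encoder/decoder pair; only the thresholding step is shown here.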
  • Benjamin Lefaudeux - caggiano.github.io/)</li> <li>[Sean Naren](https://github.com/SeanNaren)</li> <li>[Min Xu](https://github.com/min-xu-ai)</li> <li>[Jieru Hu](https://github.com/jieru-hu)</li> <li>[Marta Tintore](https://github.com/MartaTintore)</li> <li>[Susan Zhang](https://suchenzang.github.io/)</li> <li>[Patrick Labatut](https://github.com/patricklabatut)</li> <li>[Daniel Haziza](https://scholar.google.com/citations?user=2eSKdFMAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/xformers?style=social)](https://github.com/facebookresearch/xformers) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://facebookresearch.github.io/xformers/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/sputnik), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hgyhungry/ge-spmm), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/triton), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RobinBruegger/RevTorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mlpen/Nystromformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairscale), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/huggingface/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dao-AILab/flash-attention)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NJyZCdxnGe4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/xformers/blob/main/docs/source/xformers_mingpt.ipynb) | 11.08.2023 |
  • Google - badge.php?doi=10.18653/v1/N19-1423)](https://doi.org/10.18653/v1/N19-1423) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1810.04805), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1711.05101)</li><li>[data](https://ai.stanford.edu/~amaas/data/sentiment/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/text-classification)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://tfhub.dev/google/collections/bert/1)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/text/blob/master/docs/tutorials/classify_text_with_bert.ipynb) | 08.08.2023 |
  • Arseniy Shakhmatov - forever/Kandinsky-2?style=social)](https://github.com/ai-forever/Kandinsky-2) <ul><li>[blog post](https://habr.com/ru/companies/sberbank/articles/725282/)</li><li>[demo](https://editor.fusionbrain.ai/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/sberbank-ai/Kandinsky_2.1)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LZvp4SWcCao), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/IoPhRE37XSU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dYt9xJ7dnpU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rN2J5TL2RZ0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1xSbu-b-EwYd6GdaFPRVgvXBX_mciZ41e) | 07.08.2023 |
  • svc develop team - develop-team/so-vits-svc?style=social)](https://github.com/svc-develop-team/so-vits-svc) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NaruseMioShirakana/MoeVoiceStudio), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvpi/DiffSinger/tree/refactor/modules/nsf_hifigan), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/auspicious3000/contentvec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/yxlllc/DDSP-SVC), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/flutydeer/audio-slicer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvpi/audio-slicer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/NaruseMioShirakana/MoeSS-SUBModel/tree/main)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/svc-develop-team/so-vits-svc/blob/4.1-Stable/sovits4_for_colab.ipynb) | 31.07.2023 |
  • Yuan-Chen Guo - Tian Liu](https://github.com/thuliu-yt16)</li> <li>[Ruizhi Shao](https://github.com/DSaurus)</li> <li>[Christian Laforte](https://github.com/claforte)</li><details><summary>others</summary><li>[Vikram Voleti](https://github.com/voletiv)</li> <li>[Guan Luo](https://github.com/logan0601)</li> <li>[Chia-Hao Chen](https://scholar.google.com/citations?user=X0zirvMAAAAJ)</li> <li>[Zi-Xin Zou](https://github.com/zouzx)</li> <li>[Chen Wang](https://cwchenwang.github.io/)</li> <li>[Yanpei Cao](https://yanpei.me/)</li> <li>[Song-Hai Zhang](https://scholar.google.com/citations?user=AWtV-EQAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/threestudio-project/threestudio?style=social)](https://github.com/threestudio-project/threestudio) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.15413), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.16213), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.10440)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/ejer2MAB8N)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/DSaurus/Tensor4D), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/eladrich/latent-nerf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Gorilla-Lab-SCUT/Fantasia3D), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/cvlab-columbia/zero123), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/guochengqian/Magic123), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ayaanzhaque/instruct-nerf2nerf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/KAIR-BAIR/nerfacc), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Lightning-AI/lightning), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ashawkey/fantasia3d.unofficial)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/DeepFloyd/IF-I-XL-v1.0), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/huggingface_hub/v0.14.1/guides/download#download-an-entire-repository)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1635cb0/threestudio_a_unified_framework_for_3d_content/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gT8Xvx5b6IE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/threestudio-project/threestudio/blob/main/threestudio.ipynb) | 28.07.2023 |
  • Google - 2019-notes01-wordvecs1.pdf)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf)</li><li>[projector](http://projector.tensorflow.org/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/cbow-word2vec), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/skip-gram-word2vec)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Zipf%27s_law)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/text/word2vec.ipynb) | 25.07.2023 |
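The word2vec tutorial builds skip-gram training pairs from a window around each target token. A minimal pure-Python sketch of that pair generation (not the tutorial's tf.keras pipeline):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (target, context) pairs as in skip-gram word2vec:
    every token within `window` positions of the target is a context word."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "wide", "road"], window=1))
# [('the', 'wide'), ('wide', 'the'), ('wide', 'road'), ('road', 'wide')]
```

Training then learns embeddings that score true (target, context) pairs above negatively sampled ones.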
  • James Betker - tts?style=social)](https://github.com/neonbjb/tortoise-tts) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.12092), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.09672), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.07889)</li><li>[examples](https://nonint.com/static/tortoise_v2_examples.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/neonbjb/DL-Art-School)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/patrickvonplaten), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/osanseviero/tortoisse-tts)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J3-jfS29RF4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/neonbjb/tortoise-tts/blob/main/tortoise_tts.ipynb) | 15.07.2023 |
  • Leandro von Werra - human-preferences)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xQ5nc1CF7iQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/67SO20dszNA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/trl/blob/master/examples/notebooks/best_of_n.ipynb) | 14.07.2023 |
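The linked TRL notebook demonstrates best-of-n sampling: draw several completions, score each with a reward model, and keep the highest-scoring one. The selection step in pure Python, with a toy reward function standing in for a learned reward model:

```python
def best_of_n(candidates, reward):
    """Best-of-n sampling: score every candidate and return the argmax."""
    return max(candidates, key=reward)

# Toy reward that prefers longer completions (a stand-in for a reward model).
samples = ["ok", "a better answer", "fine"]
print(best_of_n(samples, reward=len))  # → a better answer
```

In TRL the candidates come from a language model's sampler and the reward from a trained preference model; the argmax step is all that differs from ordinary sampling.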
  • BigScience - workshop/petals?style=social)](https://github.com/bigscience-workshop/petals) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.01188), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.07258)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/borzunov/chat.petals.ml), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/timDettmers/bitsandbytes)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/bigscience/bloom)</li><li>[project](https://petals.ml/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/BitTorrent)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Ervk6HPNS6AYVr3xVdQnY5a-TjjmLCdQ) | 05.07.2023 |
  • Ian Osband - wen.com/)</li> <li>[Seyed Mohammad Asghari](https://github.com/mohammadasghari)</li> <li>[Vikranth Dwaracherla](https://github.com/dvikranth)</li><details><summary>others</summary><li>[Morteza Ibrahimi](https://github.com/mibrahimi)</li> <li>[Xiuyuan Lu](https://scholar.google.com/citations?user=SPL_2lIAAAAJ)</li> <li>[Benjamin Van Roy](https://web.stanford.edu/~bvr/)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/enn?style=social)](https://github.com/deepmind/enn) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.08924)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/deepminds-epistemic-neural-networks-open-new-avenues-for-uncertainty-modelling-in-large-and-fa83ab00aba3)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/j8an0dKcX4A)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/enn/blob/master/enn/colabs/enn_demo.ipynb) | 26.06.2023 |
  • Alex Shonenkov - ai)</li> <li>[Daria Bakshandaeva](https://github.com/Gugutse)</li> <li>[Christoph Schuhmann](http://christoph-schuhmann.de/)</li><details><summary>others</summary><li>[Ksenia Ivanova](https://github.com/ivksu)</li> <li>[Nadiia Klokova](https://github.com/vauimpuls)</li></ul></details> | [![](https://img.shields.io/github/stars/deep-floyd/IF?style=social)](https://github.com/deep-floyd/IF) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.11487)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/umz62Mgr)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/DeepFloyd), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/optimization/fp16#model-offloading-for-fast-inference-and-memory-savings), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-speed), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-memory), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/blog/if), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/main/en/api/pipelines/if)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/shonenkov/deepfloyd-if-4-3b-generator-of-pictures)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/deepfloydai)</li><li>[website](https://deepfloyd.ai/deepfloyd-if)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4Zkipll5Rjc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tq5ZXZWwTPA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rLtfd1TvYJk)</li></ul> | [![Open In 
Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/deepfloyd_if_free_tier_google_colab.ipynb) | 26.06.2023 |
  • Vincent Stimper - cr)</li> <li>[Vincent Berenz](http://vincentberenz.is.tuebingen.mpg.de/)</li><details><summary>others</summary><li>[Lukas Ryll](https://github.com/lukasryll)</li> <li>[Bernhard Schölkopf](https://scholar.google.com/citations?user=DZ-fHPgAAAAJ)</li> <li>[José Miguel Hernández-Lobato](https://jmhl.org/)</li></ul></details> | [![](https://img.shields.io/github/stars/VincentStimper/normalizing-flows?style=social)](https://github.com/VincentStimper/normalizing-flows) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2302.12014)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://vincentstimper.github.io/normalizing-flows/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/VincentStimper/resampled-base-flows), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/VincentStimper/hmc-hyperparameter-tuning)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Von_Mises_distribution)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/VincentStimper/normalizing-flows/blob/master/examples/paper_example_nsf_colab.ipynb) | 26.06.2023 |
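Normalizing-flow packages such as the one linked above evaluate densities with the change-of-variables formula: log p_x(x) = log p_z(f⁻¹(x)) + log |det J_{f⁻¹}(x)|. A minimal sketch for a 1-D affine flow x = scale·z + shift with a standard normal base (pure Python, not the package's API):

```python
import math

def standard_normal_logpdf(z):
    """log density of N(0, 1) at z."""
    return -0.5 * (z * z + math.log(2 * math.pi))

def affine_flow_logpdf(x, scale, shift):
    """log p(x) for x = scale * z + shift, z ~ N(0, 1), via change of
    variables: log p_x(x) = log p_z((x - shift) / scale) - log|scale|."""
    z = (x - shift) / scale
    return standard_normal_logpdf(z) - math.log(abs(scale))

# With scale=1, shift=0 the flow is the identity, so this matches N(0, 1).
print(round(affine_flow_logpdf(0.0, 1.0, 0.0), 4))  # -0.9189
```

Real flows stack many invertible layers (splines, couplings) and sum the per-layer log-determinants, but each layer contributes exactly this pattern.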
  • Jade Copet - kant-339a3b1b7)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li> <li>[Alexandre Défossez](https://ai.honu.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/audiocraft?style=social)](https://github.com/facebookresearch/audiocraft) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.05284), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.11325)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/encodec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/camenduru/MusicGen-colab)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/musicgen-large)</li><li>[project](https://ai.honu.io/papers/musicgen/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/v-YpvPkhdO4), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=EGfxuTy9Eeo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/la2fGS0dW98)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1fxGqfg96RBUvGxZ1XXN07s3DthrKUl4-) | 11.06.2023 |
  • Billy Lamberta - embeddings-for-nmt?style=social)](https://github.com/neulab/word-embeddings-for-nmt) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.03762), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1903.03878)</li><li>[link](https://deepmind.com/blog/article/alphastar-mastering-real-time-strategy-game-starcraft-ii)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/text/blob/master/docs/tutorials/transformer.ipynb) | 02.06.2023 |
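The linked transformer tutorial implements the sinusoidal positional encoding from "Attention Is All You Need" (arXiv:1706.03762): PE(pos, 2i) = sin(pos / 10000^(2i/d)) and PE(pos, 2i+1) = cos of the same angle. A minimal sketch of that formula for a single position:

```python
import math

def positional_encoding(pos, d_model):
    """Sinusoidal positional encoding: even dimensions use sin, odd use cos,
    with the angle rate halving in frequency as the dimension index grows."""
    pe = []
    for i in range(d_model):
        angle = pos / (10000 ** ((i // 2 * 2) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

print([round(x, 3) for x in positional_encoding(0, 4)])  # [0.0, 1.0, 0.0, 1.0]
```

The tutorial vectorizes this over all positions into one matrix that is added to the token embeddings.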
  • Yuxin Wu - detectron2-a-pytorch-based-modular-object-detection-library-/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://detectron2.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/matterport/Mask_RCNN/tree/master/samples/balloon)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) | 26.05.2023 |
  • Bo Peng - anthony.github.io/)</li> <li>[Alon Albalak](https://alon-albalak.github.io/)</li><details><summary>others</summary><li>[Samuel Arcadinho](https://github.com/SSamDav)</li> <li>[Matteo Grella](http://www.matteogrella.com/)</li> <li>[Kranthi Kiran](https://kranthigv.github.io/)</li> <li>[Haowen Hou](https://github.com/howard-hou)</li> <li>[Przemyslaw Kazienko](https://kazienko.eu/en)</li> <li>[Jan Kocon](https://github.com/KoconJan)</li> <li>[Bartlomiej Koptyra](https://github.com/bkoptyra)</li> <li>[Ipsit Mantri](https://ipsitmantri.github.io/)</li> <li>[Ferdinand Mom](https://3outeille.github.io/)</li> <li>[Xiangru Tang](https://github.com/tangxiangru)</li> <li>[Johan Wind](https://johanwind.github.io/)</li> <li>[Stanisław Woźniak](https://www.researchgate.net/profile/Stanislaw-Wozniak-3)</li> <li>[Qihang Zhao](https://www.researchgate.net/profile/Qihang-Zhao-2)</li> <li>[Peng Zhou](https://pengzhou.sites.ucsc.edu/)</li> <li>[Jian Zhu](https://lingjzhu.github.io/)</li> <li>[Rui-Jie Zhu](https://scholar.google.com/citations?user=08ITzJsAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/BlinkDL/ChatRWKV?style=social)](https://github.com/BlinkDL/ChatRWKV) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.13048)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/bDSBUMeFpc)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/saharNooby/rwkv.cpp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/harrisonvanderbyl/rwkv-cpp-cuda), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Blealtan/RWKV-LM-LoRA), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/josStorer/RWKV-Runner)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/BlinkDL)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/1135aew/r_rwkv4_14b_release_and_chatrwkv_a_surprisingly/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/BlinkDL_AI)</li><li>[website](https://www.rwkv.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/UeAD1qWNb1U)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/resloved/RWKV-notebooks/blob/master/RWKV_ChatRWKV.ipynb) | 08.05.2023 |
  • Guangyao Zhou - dedieu)</li> <li>[Miguel Lázaro-Gredilla](https://www.tsc.uc3m.es/~miguel/)</li><details><summary>others</summary><li>[Shrinu Kushagra](https://cs.uwaterloo.ca/~skushagr/)</li> <li>[Dileep George](https://dileeplearning.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/PGMax?style=social)](https://github.com/deepmind/PGMax) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.04110)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Belief_propagation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/PGMax/blob/main/examples/rcn.ipynb) | 05.05.2023 |
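PGMax runs loopy belief propagation on factor graphs. A minimal sketch of the underlying sum-product idea on the smallest possible graph, two discrete variables joined by one pairwise factor (a pure-Python toy, not PGMax's JAX API):

```python
def chain_marginal(unary, pairwise):
    """Sum-product message on a two-variable chain:
    p(x2) ∝ sum_x1 unary[x1] * pairwise[x1][x2]."""
    msg = [sum(unary[a] * pairwise[a][b] for a in range(len(unary)))
           for b in range(len(pairwise[0]))]
    z = sum(msg)  # normalize so the marginal sums to 1
    return [m / z for m in msg]

unary = [0.9, 0.1]                   # X1 is probably state 0
pairwise = [[0.8, 0.2], [0.2, 0.8]]  # attractive coupling: X2 agrees with X1
print([round(m, 4) for m in chain_marginal(unary, pairwise)])  # [0.74, 0.26]
```

On tree-structured graphs this message passing is exact; on loopy graphs like those PGMax targets, the same updates are iterated to approximate the marginals.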
  • Stability AI - AI/StableLM?style=social)](https://github.com/Stability-AI/StableLM) <ul><li>[blog post](https://stability.ai/blog/stability-ai-launches-the-first-of-its-stablelm-suite-of-language-models)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/llama), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/tatsu-lab/stanford_alpaca), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nomic-ai/gpt4all), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/databrickslabs/dolly), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/anthropics/hh-rlhf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/llama.cpp)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/lmsys/vicuna-13b-delta-v0), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/RyokoAI/ShareGPT52K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dypPSs4t77g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nWf1StvtoRw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Hg-s2RTaTFE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qXtJjoEfTnA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Stability-AI/StableLM/blob/main/notebooks/stablelm-alpha.ipynb) | 27.04.2023 |
  • Eren Gölge - AlJafari](https://github.com/Aya-AlJafari)</li> <li>[Edresson Casanova](https://github.com/Edresson)</li> <li>[Josh Meyer](http://jrmeyer.github.io/)</li><details><summary>others</summary><li>[Kelly Davis](https://github.com/kdavis-coqui)</li> <li>[Reuben Morais](https://github.com/reuben)</li></ul></details> | [![](https://img.shields.io/github/stars/coqui-ai/TTS?style=social)](https://github.com/coqui-ai/TTS) <ul><li>[blog post](https://coqui.ai/blog/tts/solving-attention-problems-of-tts-models-with-double-decoder-consistency)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://tts.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/coqui-ai/TTS-papers)</li><li>[samples](https://erogol.github.io/ddc-samples/)</li><li>[website](https://coqui.ai/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ADnBCz0Wd1U), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Yglxf2WbkLU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/alpI-DnVlO0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/coqui-ai/TTS/blob/dev/notebooks/Tutorial_2_train_your_first_TTS_model.ipynb) | 26.04.2023 |
  • Ross Wightman - 5b/), [data](https://laion.ai/blog/laion-400-open-dataset/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mlfoundations/wise-ft), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/webdataset/webdataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/webdataset/tarp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research-datasets/conceptual-12m)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/laion/laion2B-en), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-B-32-laion2B-s34B-b79K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-L-14-laion2B-s32B-b82K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-H-14-laion2B-s32B-b79K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-g-14-laion2B-s12B-b42K)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mlfoundations/open_clip/blob/master/docs/Interacting_with_open_clip.ipynb) | 16.04.2023 |
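CLIP-style models such as those trained with open_clip rank image/text pairs by cosine similarity of their embeddings. A toy sketch of that retrieval scoring (the embeddings here are made up, not real CLIP outputs):

```python
import math

def cosine(u, v):
    """Cosine similarity: the score CLIP-style models use to compare
    an image embedding against candidate text embeddings."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

image_emb = [0.6, 0.8, 0.0]  # toy unit-norm "image" embedding
captions = {"a dog": [0.6, 0.8, 0.0], "a car": [0.0, 0.0, 1.0]}
best = max(captions, key=lambda c: cosine(image_emb, captions[c]))
print(best)  # → a dog
```

In practice both encoders produce normalized vectors, so the cosine reduces to a dot product and a softmax over these scores gives zero-shot classification probabilities.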
  • Antonin Raffin - a.me/)</li> <li>[Adam Gleave](https://www.gleave.me/)</li> <li>[Anssi Kanervisto](https://github.com/Miffyli)</li><details><summary>others</summary><li>[Maximilian Ernestus](https://github.com/ernestum)</li> <li>[Noah Dormann](https://github.com/ndormann)</li></ul></details> | [![](https://img.shields.io/github/stars/DLR-RM/stable-baselines3?style=social)](https://github.com/DLR-RM/stable-baselines3) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://stable-baselines3.readthedocs.io/en/master/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Stable-Baselines-Team/stable-baselines3-contrib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hill-a/stable-baselines), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/gym/wiki/Environments)</li><li>[paper](https://jmlr.org/papers/v22/20-1364.html)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/reinforcementlearning/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLQVvvaa0QuDf0O2DWwLZBfJeYY-JOeZB1)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Stable-Baselines-Team/rl-colab-notebooks/blob/sb3/stable_baselines_getting_started.ipynb) | 14.04.2023 |
  • Antonin Raffin - RM/rl-baselines3-zoo?style=social)](https://github.com/DLR-RM/rl-baselines3-zoo) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05719)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://stable-baselines3.readthedocs.io/en/master/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/DLR-RM/rl-baselines3-zoo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/roboschool), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Farama-Foundation/Minigrid)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/sb3)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Stable-Baselines-Team/rl-colab-notebooks/blob/sb3/rl-baselines-zoo.ipynb) | 14.04.2023 |
  • IDEA-Research - Research/Grounded-Segment-Anything?style=social)](https://github.com/IDEA-Research/Grounded-Segment-Anything) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.02643), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.05499)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/MasterBin-IIAU/UNINEXT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/OSX), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/dvlab-research/VoxelNeXt), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Semantic-SAM), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Segment-Everything-Everywhere-All-At-Once), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/OpenSeeD), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Computer-Vision-in-the-Wild/CVinW_Readings), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/sail-sg/EditAnything), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/feizc/IEA), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Li-Qingyun/sam-mmrotate), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/VainF/Awesome-Anything), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RockeyCoss/Prompt-Segment-Anything)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oEQYStnF2l8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gKTYMfwPo4M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0Fpb8TBH0nM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GuEDDBWrN24)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/betogaona7/Grounded-Segment-Anything/blob/main/grounded_sam_colab_demo.ipynb) | 12.04.2023 |
  • Google - importing-data-wrong-c171f52eea00)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YrMy-BAqk8k), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/6th3rahsw9Y), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3HYy0SPd7TE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MvcK-MaXbHk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/datasets/blob/master/docs/overview.ipynb) | 11.04.2023 |
  • Taku Kudo - smt/mosesdecoder/blob/master/scripts/tokenizer/tokenizer.perl), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rsennrich/subword-nmt), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/gperftools/gperftools), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Microsoft/vcpkg)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://jacky2wong.medium.com/understanding-sentencepiece-under-standing-sentence-piece-ac8da59f6b08)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U51ranzJBpY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/sentencepiece/blob/master/python/sentencepiece_python_module_example.ipynb) | 08.04.2023 |
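The SentencePiece entry above links subword-nmt, and tokenizers of this family are built around byte-pair-encoding merges. As a rough illustration of the core step only (a toy sketch, not SentencePiece's actual API), one merge iteration counts adjacent symbol pairs and fuses the most frequent one:

```python
from collections import Counter

def most_frequent_pair(corpus):
    """corpus: list of token sequences. Return the most common adjacent pair.

    Toy illustration of one BPE merge step; real tokenizers (SentencePiece,
    subword-nmt) add frequency weighting, vocabularies and special tokens.
    """
    pairs = Counter()
    for tokens in corpus:
        for a, b in zip(tokens, tokens[1:]):
            pairs[(a, b)] += 1
    return pairs.most_common(1)[0][0]

def merge_pair(corpus, pair):
    """Replace every occurrence of `pair` with a single merged symbol."""
    merged = []
    for tokens in corpus:
        out, i = [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
                out.append(tokens[i] + tokens[i + 1])
                i += 2
            else:
                out.append(tokens[i])
                i += 1
        merged.append(out)
    return merged
```

For example, on `[["a","b","c"], ["a","b"]]` the pair `("a","b")` occurs twice, so merging yields `[["ab","c"], ["ab"]]`; repeating this loop builds the subword vocabulary.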
  • Hugging Face
  • Adam Stewart - us/research/people/jlavista/)</li> <li>[Arindam Banerjee](https://arindam.cs.illinois.edu/)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/torchgeo?style=social)](https://github.com/microsoft/torchgeo) <ul><li>[NDBI](https://www.linkedin.com/pulse/ndvi-ndbi-ndwi-calculation-using-landsat-7-8-tek-bahadur-kshetri/)</li><li>[NDVI](https://gisgeography.com/ndvi-normalized-difference-vegetation-index/)</li><li>[NDWI](https://custom-scripts.sentinel-hub.com/custom-scripts/sentinel-2/ndwi/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.08872)</li><li>[data](https://docs.sentinel-hub.com/api/latest/data/sentinel-2-l2a/), [data](https://www.cogeo.org/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/davemlz/awesome-spectral-indices)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/torchgeo/blob/main/docs/tutorials/indices.ipynb) | 29.03.2023 |
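The torchgeo tutorial above computes spectral indices (NDVI, NDWI, NDBI) from satellite bands. The formulas themselves are plain normalized differences; a minimal pure-Python sketch of the math (band values assumed to be reflectances; this is not torchgeo's API):

```python
def normalized_difference(a, b):
    """Generic normalized-difference index: (a - b) / (a + b)."""
    return (a - b) / (a + b)

def ndvi(nir, red):
    # NDVI highlights vegetation: near-infrared vs. red reflectance.
    return normalized_difference(nir, red)

def ndwi(green, nir):
    # NDWI highlights water bodies: green vs. near-infrared reflectance.
    return normalized_difference(green, nir)

def ndbi(swir, nir):
    # NDBI highlights built-up areas: short-wave infrared vs. near-infrared.
    return normalized_difference(swir, nir)
```

Healthy vegetation reflects strongly in the near-infrared, so e.g. `ndvi(0.6, 0.1)` is around 0.71, while bare soil or water gives values near or below zero.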
  • Dongxu Li - wang)</li><details><summary>others</summary><li>[Silvio Savarese](https://scholar.google.com/citations?user=ImpbxLsAAAAJ)</li> <li>[Steven Hoi](https://sites.google.com/view/stevenhoi)</li></ul></details> | [![](https://img.shields.io/github/stars/salesforce/LAVIS?style=social)](https://github.com/salesforce/LAVIS) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.09019), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.06500), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.12597), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.10846), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2210.08773)</li><li>[blog post](https://blog.salesforceairesearch.com/lavis-language-vision-library/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://opensource.salesforce.com/LAVIS//latest/index.html)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Merlion)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/salesforce/LAVIS/blob/main/projects/img2llm-vqa/img2llm_vqa.ipynb) | 24.03.2023 |
  • Phil Wang - vincent-1958381)</li><details><summary>others</summary><li>[Eugene Kharitonov](https://eugene-kharitonov.github.io/)</li> <li>[Olivier Pietquin](https://research.google/people/105812)</li> <li>[Matt Sharifi](https://scholar.google.com/citations?user=GeQNBz0AAAAJ)</li> <li>[Olivier Teboul](https://scholar.google.com/citations?user=ep0OfyAAAAAJ)</li> <li>[David Grangier](http://david.grangier.info/)</li> <li>[Marco Tagliasacchi](https://scholar.google.com/citations?user=zwH1rZQAAAAJ)</li> <li>[Neil Zeghidour](https://github.com/lienz)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TASLP.2023.3288409)](https://doi.org/10.1109/TASLP.2023.3288409) [![](https://img.shields.io/github/stars/lucidrains/audiolm-pytorch?style=social)](https://github.com/lucidrains/audiolm-pytorch) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.03143), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.03312), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.02765), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.19466), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2002.05202), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.02150), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2207.12598), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.13290), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2210.13432), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.09883), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.05707), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2210.13438)</li><li>[blog 
post](https://blog.research.google/2022/10/audiolm-language-modeling-approach-to.html)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/xBPBXfcFHd)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/encodec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/musiclm-pytorch)</li><li>[project](https://google-research.github.io/seanet/audiolm/examples/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Vucewi_kPEU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/behUbh0koZk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/olNvmUCmY8o)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/lucidrains/audiolm-pytorch/blob/main/audiolm_pytorch_demo.ipynb) | 23.03.2023 |
  • Michael Broughton - o9AhIz1uvo)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/quantum/blob/master/docs/tutorials/hello_many_worlds.ipynb) | 20.03.2023 |
  • Billy Lamberta - image-masking-challenge/overview)</li><li>[u-net](https://lmb.informatik.uni-freiburg.de/people/ronneber/u-net/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/images/segmentation.ipynb) | 17.03.2023 |
  • Jiawei Liu - ng)</li> <li>[Yinlin Deng](https://dengyinlin.github.io/)</li> <li>[Lingming Zhang](http://lingming.cs.illinois.edu/)</li></ul> | [![](https://img.shields.io/github/stars/ise-uiuc/tzer?style=social)](https://github.com/ise-uiuc/tzer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.09947)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/repository/docker/tzerbot/oopsla)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://tzer.readthedocs.io/en/latest/index.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ganler/memcov)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ise-uiuc/tzer/blob/main/bug-report.ipynb) | 09.03.2023 |
  • Vijish Madhavan
  • Tom Hennigan - haiku?style=social)](https://github.com/deepmind/dm-haiku) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://dm-haiku.readthedocs.io/en/latest/)</li><li>[website](https://www.haiku-os.org/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/dm-haiku/blob/main/examples/haiku_lstms.ipynb) | 02.03.2023 |
  • Fatih Cagatay Akyon - badge.php?doi=10.1109/ICIP46576.2022.9897990)](https://doi.org/10.1109/ICIP46576.2022.9897990) [![](https://img.shields.io/github/stars/obss/sahi?style=social)](https://github.com/obss/sahi) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.06934)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/fcakyon/small-object-detection-benchmark)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/models?pipeline_tag=object-detection&sort=downloads)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/remekkinas/sahi-slicing-aided-hyper-inference-yv5-and-yx)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/codable/sahi-a-vision-library-for-performing-sliced-inference-on-large-images-small-objects-c8b086af3b80), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/codable/convert-any-dataset-to-coco-object-detection-format-with-sahi-95349e1fe2b7)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/obss/sahi/blob/main/demo/inference_for_yolov5.ipynb) | 23.02.2023 |
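SAHI's sliced inference runs a detector over overlapping crops of a large image and merges the detections back into full-image coordinates. The window arithmetic can be sketched in plain Python (a simplified sketch; sahi's own slicing utilities additionally handle annotation remapping and edge cases):

```python
def slice_grid(image_w, image_h, slice_w, slice_h, overlap=0.2):
    """Return (x0, y0, x1, y1) crop windows covering the image.

    Adjacent windows overlap by `overlap` (a ratio of the slice size);
    the last row/column is shifted back so no window leaves the image.
    """
    step_x = max(1, int(slice_w * (1 - overlap)))
    step_y = max(1, int(slice_h * (1 - overlap)))
    boxes = []
    y = 0
    while True:
        y0 = min(y, max(0, image_h - slice_h))
        x = 0
        while True:
            x0 = min(x, max(0, image_w - slice_w))
            boxes.append((x0, y0,
                          min(x0 + slice_w, image_w),
                          min(y0 + slice_h, image_h)))
            if x0 + slice_w >= image_w:
                break
            x += step_x
        if y0 + slice_h >= image_h:
            break
        y += step_y
    return boxes
```

A 1024×1024 image with 512×512 slices and 20% overlap yields a 3×3 grid of nine windows; each window is then fed to the detector independently.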
  • Luca Costabello - Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2013/hash/b337e84de8752b27eda3a12363109e80-Abstract.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gX_KHaU8ChI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Accenture/AmpliGraph/blob/main/docs/tutorials/AmpliGraphBasicsTutorial.ipynb) | 23.02.2023 |
  • Billy Lamberta
  • Google
  • Yuan Tang - _EId-D0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3bownM3L5zM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/tensorboard/blob/master/docs/scalars_and_keras.ipynb) | 10.02.2023 |
  • Jason Roselander - engine/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/federated-learning)</li><li>[shell](https://cloud.google.com/shell/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/high_performance_simulation_with_kubernetes.ipynb) | 31.01.2023 |
  • Damian Stewart - ai/InvokeAI/issues/2832)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/cactusfriend/nightmare-invokeai-prompts)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/damian0815/compel/blob/main/compel-demo.ipynb) | 26.01.2023 |
  • Han Xiao - Griffiths](http://blog.alexcg.net/)</li></ul> | [![](https://img.shields.io/github/stars/jina-ai/dalle-flow?style=social)](https://github.com/jina-ai/dalle-flow) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Jack000/glid-3-xl), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/jina-ai/docarray)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/CompVis/stable-diffusion-v-1-4-original)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL3UBBWOUVhFYRUa_gpYYKBqEAkO4sxmne), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/c/jina-ai)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jina-ai/dalle-flow/blob/main/client.ipynb) | 26.01.2023 |
  • Hugging Face - Diffusion-Models)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/CompVis/text2img-latent-diffusion), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/CompVis/celeba-latent-diffusion), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/fusing/celeba-diffusion), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/huggingface/diffuse-the-rest), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Shuang59/Composable-Diffusion)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/hugging-face-just-released-the-diffusers-library-846f32845e65)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/UzkdOg7wWmI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/diffusers_intro.ipynb) | 17.01.2023 |
  • Aleksei Petrenko - huang.github.io/)</li> <li>[Tushar Kumar](https://github.com/tushartk)</li> <li>[Gaurav Sukhatme](http://robotics.usc.edu/~gaurav/)</li> <li>[Vladlen Koltun](http://vladlen.info/)</li></ul> | [![](https://img.shields.io/github/stars/alex-petrenko/sample-factory?style=social)](https://github.com/alex-petrenko/sample-factory) <ul><li>[ICML](http://proceedings.mlr.press/v119/petrenko20a.html)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.11751)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://www.samplefactory.dev/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/alex-petrenko/faster-fifo)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/lLG17LKKSZc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/alex-petrenko/sample-factory/blob/master/sf_examples/notebooks/samplefactory_hub_example.ipynb) | 17.01.2023 |
  • Andreas Köpf - schuhmann.de/)</li><details><summary>others</summary><li>[Keith Stevens](https://fozziethebeat.github.io/)</li> <li>[Abdullah Barhoum](https://github.com/AbdBarho)</li> <li>[Nguyen Minh Duc](https://github.com/notmd)</li> <li>[Oliver Stanley](https://olliestanley.github.io/)</li> <li>[James Melvin Ebenezer](https://github.com/melvinebenezer)</li></ul></details> | [![](https://img.shields.io/github/stars/LAION-AI/Open-Assistant?style=social)](https://github.com/LAION-AI/Open-Assistant) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.02155)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://projects.laion.ai/Open-Assistant/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/OpenAssistant)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://generativeai.pub/open-assistant-a-free-and-open-source-alternative-to-chatgpt-67d15229813)</li><li>[website](https://open-assistant.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/64Izfm24FKA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ddG2fM9i4Kk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FQIHLFLrTw0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/LAION-AI/Open-Assistant/blob/main/notebooks/data-augmentation/stackexchange-builder/stackexchange-builder.ipynb) | 14.01.2023 |
  • Oleksii Kuchaiev
  • Gengshan Yang - y/rigidmask), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ShichenLiu/SoftRas), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ThibaultGROUEIX/ChamferDistancePytorch)</li><li>[project](https://banmo-www.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1NUa-yvFGA0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/jDTy-liFoCQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dQJn1vsuz0DkyRZbOA1SulkVQ0V1kMUP) | 30.12.2022 |
  • Google - 9MYvPwI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MXxN4fv01c8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FsxthdQ_sL4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zEOtG-ChmZE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kBjYK3K3P6M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8j1MWZGNoXM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hszd5UqnfLk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/tpu/blob/master/tools/colab/keras_mnist_tpu.ipynb) | 20.12.2022 |
  • Sergio Guadarrama - g.github.io/)</li><details><summary>others</summary><li>[Ethan Holly](https://github.com/eholly-g)</li> <li>[Sam Fishman](http://sam.fish/)</li> <li>[Ke Wang](https://scholar.google.com/citations?user=QRYX59sAAAAJ)</li> <li>[Ekaterina Gonina](https://github.com/egonina)</li> <li>[Neal Wu](https://twitter.com/WuNeal)</li> <li>[Efi Kokiopoulou](https://github.com/efiko)</li> <li>[Luciano Sbaiz](https://scholar.google.com/citations?user=fKBmhcUAAAAJ)</li> <li>[Jamie Smith](https://scholar.google.com/citations?user=jk17mo8AAAAJ)</li> <li>[Gábor Bartók](https://github.com/bartokg)</li> <li>[Jesse Berent](https://www.linkedin.com/in/jesse-berent-a1b6875)</li> <li>[Chris Harris](https://www.linkedin.com/in/charris)</li> <li>[Vincent Vanhoucke](https://vincent.vanhoucke.com/)</li> <li>[Eugene Brevdo](https://ebrevdo.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/tensorflow/agents?style=social)](https://github.com/tensorflow/agents) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://www.tensorflow.org/agents/api_docs/python/tf_agents)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/introduction-to-tf-agents-a-library-for-reinforcement-learning-in-tensorflow-68ab9add6ad6), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/analytics-vidhya/tf-agents-a-flexible-reinforcement-learning-library-for-tensorflow-5f125420f64b)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/agents)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2nKD6zFQ8xI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-TTziY7EmUA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/52DTXidSVWc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U7g7-Jzj9qo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tAOApRQAgpc), [<img src="images/yt.svg" alt="yt" 
height=20/>](https://youtu.be/X4eruXqNbDc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g0yDlAbi6Pc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VmZI_YkfPBM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7QFSziiAnxI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/agents/blob/master/docs/tutorials/0_intro_rl.ipynb) | 15.12.2022 |
  • Matthias Fey - team/pytorch_geometric?style=social)](https://github.com/pyg-team/pytorch_geometric) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1903.02428), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.07829), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1609.02907), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.03123), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1905.05178), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.08566), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.10903), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1905.07953)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pytorch-geometric.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/snap-stanford/ogb/tree/master/examples), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/pyg-team/pyg-lib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rusty1s/pytorch_scatter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rusty1s/pytorch_sparse), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rusty1s/pytorch_cluster), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AntonioLonga/PytorchGeometricTutorial)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2018/hash/e77dbaf6759253c7c6d0efc5690369c7-Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2017/hash/5dd9db5e033da9c6fb5ba83c7a7ebea9-Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://nips.cc/virtual/2020/public/poster_3fe230348e9a12c13120749e3f9fa4cd.html)</li><li>[<img 
src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/tutorials/beginner/basics/optimization_tutorial.html#full-implementation)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLGMXrbDNfqTzqxB1IGgimuhtfAhGd8lHF), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLGMXrbDNfqTwPxitLVHEbT9Pd6-oR_cud), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-UjytpbqX4A)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1h3-vJGRVloF5zStxL5I0rSy4ZUPNsjy8) | 08.12.2022 |
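The PyTorch Geometric entry above revolves around message passing over an edge list. Stripped of tensors and learned weights, one aggregation step reduces to averaging incoming neighbour features per target node (a toy sketch of the idea, not PyG's `MessagePassing` API):

```python
from collections import defaultdict

def mean_aggregate(features, edges):
    """One message-passing step: each node receives the mean of the
    features sent along its incoming edges; isolated nodes keep their own.

    features: list of equal-length feature vectors, one per node.
    edges: list of (src, dst) index pairs.
    """
    dim = len(features[0])
    sums = defaultdict(lambda: [0.0] * dim)
    counts = defaultdict(int)
    for src, dst in edges:
        for i, v in enumerate(features[src]):
            sums[dst][i] += v
        counts[dst] += 1
    return [
        [s / counts[n] for s in sums[n]] if counts[n] else list(features[n])
        for n in range(len(features))
    ]
```

With features `[[1.0], [3.0], [5.0]]` and edges `[(0, 2), (1, 2)]`, node 2 receives the mean `[2.0]` while nodes 0 and 1, having no incoming edges, keep their own features. A GNN layer interleaves such aggregation with learned transforms.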
  • Anton Emelyanov - forever/ru-gpts?style=social)](https://github.com/ai-forever/ru-gpts) <ul><li>[Christofari](https://sbercloud.ru/ru/christofari)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeedExamples/tree/master/Megatron-LM)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/transformers/main_classes/model.html#transformers.generation_utils.GenerationMixin.generate)</li><li>[sparse attention](https://www.deepspeed.ai/tutorials/sparse-attention/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/ru-gpts/blob/master/examples/ruGPT3XL_generation.ipynb) | 07.12.2022 |
  • Nathan Raw - diffusion-videos?style=social)](https://github.com/nateraw/stable-diffusion-videos) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://gist.github.com/karpathy/00103b0037c5aaea32fe1da1af553355), [<img src="images/git.svg" alt="git" height=20/>](https://gist.github.com/nateraw/c989468b74c616ebbc6474aa8cdd9e53)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/nateraw/stable-diffusion-videos/blob/main/stable_diffusion_videos.ipynb) | 05.12.2022 |
  • Craig Macdonald - badge.php?doi=10.1145/3459637.3482013)](https://doi.org/10.1145/3459637.3482013) [![](https://img.shields.io/github/stars/terrier-org/pyterrier?style=social)](https://github.com/terrier-org/pyterrier) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2007.14271)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pyterrier.readthedocs.io)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrier-org/ecir2021tutorial), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_ance), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_colbert), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_pisa), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_t5), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_doc2query), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_deepct)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/terrier-org/pyterrier/blob/master/examples/notebooks/non_en_retrieval.ipynb) | 02.11.2022 |
  • Alexander Kapitanov - theory?style=social)](https://github.com/hukenovs/dsp-theory) <ul><li>[blog post](https://habr.com/ru/articles/460445/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/hukenovs/dsp-theory/blob/master/src/dsp_theory_1_signals.ipynb) | 18.10.2022 |
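The dsp-theory notebooks start from discrete-time signal generation. Sampling a sinusoid at rate `fs` reduces to evaluating `sin(2*pi*f*n/fs)` at integer sample indices; a minimal stdlib sketch:

```python
import math

def sampled_sine(freq_hz, fs_hz, n_samples, amplitude=1.0):
    """Sample a sinusoid of `freq_hz` at sampling rate `fs_hz`.

    Meaningful only below the Nyquist limit (freq_hz < fs_hz / 2);
    above it the tone aliases to a lower apparent frequency.
    """
    return [amplitude * math.sin(2 * math.pi * freq_hz * n / fs_hz)
            for n in range(n_samples)]
```

Sampling a 1 Hz tone at 4 Hz for one period gives (up to rounding) `[0, 1, 0, -1]`, i.e. four samples per cycle.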
  • Ilya Belikov - Text-to-Music?style=social)](https://github.com/MubertAI/Mubert-Text-to-Music) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mubert2.docs.apiary.io/)</li><li>[project](https://mubert.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YJu0iXn-T_U), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5UsaxJsFvAI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/B0kkIpWifG4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ferluht/Mubert-Text-to-Music/blob/main/Mubert_Text_to_Music.ipynb) | 18.10.2022 |
  • Rishabh Agarwal - research/batch_rl?style=social)](https://github.com/google-research/batch_rl) <ul><li>[DQN](https://www.nature.com/articles/nature14236?wm=book_wap_0005)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.04543), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1709.06009)</li><li>[blog post](https://ai.googleblog.com/2020/04/an-optimistic-perspective-on-offline.html)</li><li>[data](https://console.cloud.google.com/storage/browser/atari-replay-datasets), [data](https://research.google/resources/datasets/dqn-replay/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/atari-py/tree/0.2.5/atari_py/atari_roms), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mgbellemare/Arcade-Learning-Environment), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mila-iqia/SGI/blob/master/src/offline_dataset.py), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/kzl/decision-transformer/tree/master/atari)</li><li>[project](https://offline-rl.github.io/)</li><li>[slides](https://docs.google.com/presentation/d/1ROltXr6FIeYKrnGl0tKHGWI0pL4Zo8CnvAK2-cdpQyY)</li><li>[talk](https://slideslive.com/38928373/an-optimistic-perspective-on-offline-deep-reinforcement-learning)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/install/install_linux)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1ktlNni_vwFpFtCgUez-RHW0OdGc2U_Wv) | 04.10.2022 |
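Offline (batch) RL, as in the DQN Replay Dataset linked above, trains agents from a fixed store of logged transitions rather than live interaction. The underlying data structure is a bounded replay buffer; a minimal ring-buffer sketch (illustrative only, not the batch_rl/dopamine implementation):

```python
import random

class ReplayBuffer:
    """Fixed-capacity ring buffer: once full, new transitions overwrite
    the oldest ones, and training samples uniformly at random."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.storage = []
        self.pos = 0  # next slot to overwrite once full

    def add(self, transition):
        if len(self.storage) < self.capacity:
            self.storage.append(transition)
        else:
            self.storage[self.pos] = transition
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size):
        return random.sample(self.storage, batch_size)
```

In the offline setting the buffer is filled once from logged data and never grows during training, which is what makes fixed datasets like DQN Replay reusable across algorithms.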
  • Mingxing Tan - badge.php?doi=10.1109/CVPR42600.2020.01079)](https://doi.org/10.1109/CVPR42600.2020.01079) [![](https://img.shields.io/github/stars/google/automl?style=social)](https://github.com/google/automl/tree/master/efficientdet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.09070), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.13886), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1905.11946), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1804.02767)</li><li>[blog post](https://ai.googleblog.com/2020/04/efficientdet-towards-scalable-and.html)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/tensorflow/fitting-larger-networks-into-memory-583e3c758ff9)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://tfhub.dev/s?network-architecture=efficientdet)</li><li>[tutorial](https://cloud.google.com/tpu/docs/tutorials/efficientnet)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yJg1FX2goCo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/OsA3zH5NKYc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qZobxWXlJ0g)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/automl/blob/master/efficientdet/tf2/tutorial.ipynb) | 27.09.2022 |
  • Matt Hoffman - Maron](https://github.com/fastturtle)</li><details><summary>others</summary><li>[Feryal Behbahani](https://feryal.github.io/)</li> <li>[Tamara Norman](https://github.com/tamaranorman)</li> <li>[Abbas Abdolmaleki](https://scholar.google.com/citations?user=cCYTVWQAAAAJ)</li> <li>[Albin Cassirer](https://github.com/acassirer)</li> <li>[Fan Yang](https://github.com/ddmbr)</li> <li>[Kate Baumli](https://github.com/katebaumli)</li> <li>[Sarah Henderson](https://www.linkedin.com/in/sarah-henderson-agilecoach/)</li> <li>[Alex Novikov](https://scholar.google.ru/citations?user=jMUkLqwAAAAJ)</li> <li>[Sergio Gómez Colmenarejo](https://scholar.google.ru/citations?user=0Dkf68EAAAAJ)</li> <li>[Serkan Cabi](https://scholar.google.ru/citations?&user=l-HhJaUAAAAJ)</li> <li>[Caglar Gulcehre](https://www.caglarg.com/)</li> <li>[Tom Le Paine](http://tomlepaine.github.io/)</li> <li>[Andrew Cowie](https://scholar.google.ru/citations?&user=aTvi5mUAAAAJ)</li> <li>[Ziyu Wang](https://ziyuw.github.io/)</li> <li>[Bilal Piot](https://scholar.google.ru/citations?&user=fqxNUREAAAAJ)</li> <li>[Nando de Freitas](https://github.com/nandodf)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/acme?style=social)](https://github.com/deepmind/acme) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.00979)</li><li>[blog post](https://www.deepmind.com/publications/acme-a-new-framework-for-distributed-reinforcement-learning)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://dm-acme.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/dm_env)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NUwDr42bPOw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J1XCWjuyRaI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/pFMuQWpHI5k)</li></ul> | [![Open In 
Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/acme/blob/master/examples/tutorial.ipynb) | 26.09.2022 |
  • Bo Peng - anthony.github.io/)</li> <li>[Alon Albalak](https://alon-albalak.github.io/)</li><details><summary>others</summary><li>[Samuel Arcadinho](https://github.com/SSamDav)</li> <li>[Matteo Grella](http://www.matteogrella.com/)</li> <li>[Kranthi Kiran](https://kranthigv.github.io/)</li> <li>[Haowen Hou](https://github.com/howard-hou)</li> <li>[Przemyslaw Kazienko](https://kazienko.eu/en)</li> <li>[Jan Kocon](https://github.com/KoconJan)</li> <li>[Bartlomiej Koptyra](https://github.com/bkoptyra)</li> <li>[Ipsit Mantri](https://ipsitmantri.github.io/)</li> <li>[Ferdinand Mom](https://3outeille.github.io/)</li> <li>[Xiangru Tang](https://github.com/tangxiangru)</li> <li>[Johan Wind](https://johanwind.github.io/)</li> <li>[Stanisław Woźniak](https://www.researchgate.net/profile/Stanislaw-Wozniak-3)</li> <li>[Qihang Zhao](https://www.researchgate.net/profile/Qihang-Zhao-2)</li> <li>[Peng Zhou](https://pengzhou.sites.ucsc.edu/)</li> <li>[Jian Zhu](https://lingjzhu.github.io/)</li> <li>[Rui-Jie Zhu](https://scholar.google.com/citations?user=08ITzJsAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/BlinkDL/RWKV-LM?style=social)](https://github.com/BlinkDL/RWKV-LM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.13048), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.14103), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2002.05202)</li><li>[data](https://dldata-public.s3.us-east-2.amazonaws.com/simplebooks.zip)</li><li>[demo](https://josephrocca.github.io/rwkv-v4-web/demo/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/bDSBUMeFpc)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/saharNooby/rwkv.cpp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/cgisky1980/ai00_rwkv_server), [<img src="images/git.svg" alt="git" 
height=20/>](https://github.com/harrisonvanderbyl/rwkv-cpp-cuda), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Blealtan/RWKV-LM-LoRA), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TheRamU/Fay/blob/main/README_EN.md), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ridgerchu/SpikeGPT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/RWKV-v2-RNN-Pile/tree/main/RWKV-v3), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/SmallInitEmb), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/RWKV-CUDA), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/minGPT-tuned)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/BlinkDL), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/BlinkDL/clip-guided-binary-autoencoder)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/umq908/r_rwkvv2rnn_a_parallelizable_rnn_with/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/BlinkDL_AI), [<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/HochreiterSepp/status/1524270961314484227)</li><li>[website](https://www.rwkv.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/x8pW19wKfXQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/B3Qa2rRsaXo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w-xydM6C6Qc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1F7tZoPZaWJf1fsCmZ5tjw6sYHiFOYVWM) | 21.09.2022 |
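The RWKV row above links the paper (arXiv 2305.13048) describing an RNN that matches transformer quality while keeping O(1) state per step. As a rough illustration only (not RWKV's actual multi-channel WKV kernel), here is a scalar sketch of the recurrent weighted key-value idea, with `w` as a per-step decay and `u` a bonus for the current token; all names are illustrative:

```python
import math

def wkv_recurrence(ks, vs, w=0.5, u=0.0):
    """Simplified scalar RWKV-style WKV recurrence.

    Keeps a running numerator/denominator so a softmax-like average over
    the whole past is computed with O(1) state per step instead of
    attending over all T previous tokens.
    """
    num, den = 0.0, 0.0
    outs = []
    for k, v in zip(ks, vs):
        # output mixes decayed history with the current token's contribution
        top = num + math.exp(u + k) * v
        bot = den + math.exp(u + k)
        outs.append(top / bot)
        # fold the current token into the state with exponential decay
        num = math.exp(-w) * num + math.exp(k) * v
        den = math.exp(-w) * den + math.exp(k)
    return outs
```

Each output is a convex combination of past and current values, which is why the recurrence stays numerically bounded.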
  • Filippo Vicentini - i-szabo)</li> <li>[Dian Wu](https://github.com/wdphy16)</li><details><summary>others</summary><li>[Christopher Roth](https://github.com/chrisrothUT)</li> <li>[Clemens Giuliani](https://github.com/inailuig)</li> <li>[Gabriel Pescia](https://github.com/gpescia)</li> <li>[Jannes Nys](https://github.com/jwnys)</li> <li>[Vladimir Vargas-Calderón](https://github.com/VolodyaCO)</li> <li>[Nikita Astrakhantsev](https://github.com/nikita-astronaut)</li> <li>[Giuseppe Carleo](https://github.com/gcarleo)</li> <li>[Kenny Choo](https://github.com/kchoo1118)</li> <li>[James Smith](https://jamesetsmith.github.io/)</li> <li>[Tom Westerhout](https://github.com/twesterhout)</li> <li>[Fabien Alet](https://github.com/fabienalet)</li> <li>[Emily Davis](https://github.com/emilyjd)</li> <li>[Stavros Efthymiou](https://github.com/stavros11)</li> <li>[Ivan Glasser](https://www.researchgate.net/profile/Ivan-Glasser)</li> <li>[Sheng-Hsuan Lin](https://shhslin.github.io/)</li> <li>[Marta Mauri](https://github.com/martamau)</li> <li>[Mazzola Guglielmo](https://www.ics.uzh.ch/en/research/research-groups/Guglielmo-Mazzola0.html)</li> <li>[Christian Mendl](http://christian.mendl.net/)</li> <li>[Evert Nieuwenburg](https://evert.info/)</li> <li>[Ossian O'Reilly](https://github.com/ooreilly)</li> <li>[Hugo Théveniaut](https://github.com/theveniaut)</li> <li>[Giacomo Torlai](https://github.com/GTorlai)</li> <li>[Alexander Wietek](https://awietek.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/netket/netket?style=social)](https://github.com/netket/netket) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10526)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://netket.readthedocs.io/en/latest/index.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mpi4jax/mpi4jax), [<img src="images/git.svg" alt="git" 
height=20/>](https://github.com/cloudhan/jax-windows-builder)</li><li>[website](https://www.netket.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ryz-o71tuy8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/PhilipVinc/Lectures/blob/main/2202_NetKet/01_intro.ipynb) | 15.09.2022 |
  • Conor Heins - tschantz)</li> <li>[Beren Millidge](https://www.beren.io/)</li> <li>[Brennan Klein](https://github.com/jkbren)</li><details><summary>others</summary><li>[Arun Niranjan](https://github.com/Arun-Niranjan)</li> <li>[Daphne Demekas](https://github.com/daphnedemekas)</li></ul></details> | [![](https://img.shields.io/github/stars/infer-actively/pymdp?style=social)](https://github.com/infer-actively/pymdp) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.03904)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pymdp-rtd.readthedocs.io/en/stable/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/infer-actively/pymdp/blob/master/docs/notebooks/active_inference_from_scratch.ipynb) | 24.08.2022 |
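The pymdp notebook linked above builds discrete active inference from scratch; its perception step is an ordinary Bayesian belief update over hidden states. A minimal sketch of that step under the usual discrete-POMDP conventions (function and variable names are illustrative, not pymdp's API):

```python
def bayesian_update(prior, likelihood_matrix, observation):
    """Posterior over hidden states given one discrete observation.

    likelihood_matrix[o][s] = P(observation o | state s), i.e. the
    generative model's "A matrix" in active-inference notation.
    """
    unnorm = [likelihood_matrix[observation][s] * prior[s]
              for s in range(len(prior))]
    z = sum(unnorm)  # marginal likelihood of the observation
    return [p / z for p in unnorm]
```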
  • Robin Rombach - qp)</li> <li>[Patrick Esser](https://github.com/pesser)</li> <li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li></ul> | [![](https://img.shields.io/github/stars/CompVis/stable-diffusion?style=social)](https://github.com/CompVis/stable-diffusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10752), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.11487), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2207.12598), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.09778), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.01073)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/christophschuhmann/improved-aesthetic-predictor), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ShieldMnt/invisible-watermark), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/guided-diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/denoising-diffusion-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/x-transformers)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/CompVis), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/laion/laion2B-en), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/laion/laion-high-resolution)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/stable-diffusion/blob/main/scripts/latent_imagenet_diffusion.ipynb) | 10.08.2022 |
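Among the papers linked in the Stable Diffusion row is classifier-free guidance (arXiv 2207.12598), the sampling trick that trades diversity for prompt adherence. The combination rule itself is one line; a hedged sketch over plain lists (no diffusion machinery, names illustrative):

```python
def cfg_combine(eps_uncond, eps_cond, scale):
    """Classifier-free guidance: extrapolate the noise prediction from the
    unconditional estimate toward the conditional one by `scale`.
    scale=1 recovers the plain conditional model; larger scales follow the
    prompt more strongly at the cost of sample diversity.
    """
    return [eu + scale * (ec - eu) for eu, ec in zip(eps_uncond, eps_cond)]
```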
  • Vighnesh Birodkar - mac)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/models/blob/master/research/object_detection/colab_tutorials/deepmac_colab.ipynb) | 09.08.2022 |
  • Aadesh Gupta - r)</li> <li>[Ashish Shrivastava](https://github.com/ashish3586)</li> <li>[Nagender Aneja](https://researchid.co/naneja)</li> <li>[Zijie Wang](https://zijie.wang/)</li> <li>[Yiwen Shi](https://github.com/Yiwen-Shi)</li> <li>[Afnan Mir](https://github.com/afnanmmir)</li> <li>[William Soto](https://github.com/sotwi)</li> <li>[Chandan Singh](https://csinva.io/)</li> <li>[Claude Roux](https://github.com/ClaudeRoux)</li> <li>[Abinaya Mahendiran](https://github.com/AbinayaM02)</li> <li>[Anna Shvets](https://github.com/asnota)</li> <li>[Kaustubh Dhole](https://github.com/kaustubhdhole)</li> <li>[Bryan Wilie](https://github.com/bryanwilie)</li> <li>[Jamie Simon](https://james-simon.github.io/)</li> <li>[Mukund Varma](https://github.com/MukundVarmaT)</li> <li>[Sang Han](https://github.com/jjangsangy)</li> <li>[Denis Kleyko](https://github.com/denkle)</li> <li>[Samuel Cahyawijaya](https://github.com/SamuelCahyawijaya)</li> <li>[Filip Cornell](https://github.com/Filco306)</li> <li>[Tanay Dixit](https://tanay2001.github.io/)</li> <li>[Connor Boyle](https://github.com/boyleconnor)</li> <li>[Genta Indra Winata](https://gentawinata.com/)</li> <li>[Seungjae Ryan Lee](https://github.com/seungjaeryanlee)</li> <li>[Marcin Namysl](https://github.com/mnamysl)</li> <li>[Roman Sitelew](https://github.com/RomanPlusPlus)</li> <li>[Zhenhao Li](https://zhenhaoli.net/)</li> <li>[Fiona Tan](https://tanfiona.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/GEM-benchmark/NL-Augmenter?style=social)](https://github.com/GEM-benchmark/NL-Augmenter) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.02721)</li><li>[website](https://gem-benchmark.com/nl_augmenter)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/GEM-benchmark/NL-Augmenter/blob/main/notebooks/Write_a_sample_transformation.ipynb) | 06.08.2022 |
  • Hugging Face
  • Jacob Solawetz - to-train-yolov5-on-a-custom-dataset/)</li><li>[data](https://public.roboflow.ai/object-detection/bccd)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1gDZ2xcTOgR39tGGs-EZ6i3RTs16wmzZQ) | 20.07.2022 |
  • multimodal.art - diffusion)</li><li>[project](https://multimodal.art/mindseye)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1cg0LZ5OfN9LAIB37Xq49as0fSJxcKtC5) | 06.07.2022 |
  • John Lalor - Graber](https://github.com/ezubaric)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2021.acl-long.346)](https://doi.org/10.18653/v1/2021.acl-long.346) [![](https://img.shields.io/github/stars/nd-ball/py-irt?style=social)](https://github.com/nd-ball/py-irt) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1908.11421)</li><li>[paper](https://www.frontiersin.org/articles/10.3389/fpsyg.2016.01422/full)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/akUxtt21Mlc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/nd-ball/py-irt/blob/master/examples/py-irt_example.ipynb) | 30.06.2022 |
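py-irt (linked above) fits item response theory models to benchmark data. The core quantity is the logistic response curve; a minimal two-parameter (2PL) version, written from the standard IRT formula rather than py-irt's own code:

```python
import math

def irt_2pl(ability, difficulty, discrimination=1.0):
    """Two-parameter logistic IRT model: probability that a subject with
    latent `ability` answers an item of given `difficulty` correctly.
    `discrimination` controls how sharply the item separates subjects."""
    return 1.0 / (1.0 + math.exp(-discrimination * (ability - difficulty)))
```

When ability equals difficulty the probability is exactly 0.5, which is what makes the difficulty parameter interpretable.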
  • Daniil Chesakov - kuznetsov-70ab12127)</li> <li>[Denis Dimitrov](https://github.com/denndimitrov)</li></ul> | [![](https://img.shields.io/github/stars/ai-forever/sber-swap?style=social)](https://github.com/ai-forever/sber-swap) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.03046), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.13457), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1901.08971), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.06340), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05005), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.09965)</li><li>[blog post](https://habr.com/ru/company/sberbank/blog/645919/)</li><li>[data](https://www.robots.ox.ac.uk/~vgg/data/vgg_face/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/14wnxMvD9zsiBQo2FtTpxn6w2cpXCcb-7) | 29.06.2022 |
  • Jaehoon Lee - Dickstein](http://www.sohldickstein.com/)</li> <li>[Vinay Ramasesh](https://ramasesh.github.io/)</li> <li>[Sajant Anand](https://github.com/sajantanand)</li><details><summary>others</summary><li>[Alicia Parrish](https://aliciaparrish.com/)</li> <li>[Ethan Dyer](https://github.com/ethansdyer)</li> <li>[Liam Dugan](http://liamdugan.com/)</li> <li>[Dieuwke Hupkes](https://github.com/dieuwkehupkes)</li> <li>[Daniel Freeman](https://github.com/cdfreeman-google)</li> <li>[Guy Gur-Ari](https://github.com/guygurari)</li> <li>[Aitor Lewkowycz](https://github.com/lewkowycz)</li></ul></details> | [![](https://img.shields.io/github/stars/google/BIG-bench?style=social)](https://github.com/google/BIG-bench) <ul><li>[API](https://google.github.io/BIG-bench/docs/html/bigbench/index.html)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.04615)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/seqio)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/BIG-bench/blob/master/notebooks/colab_examples.ipynb) | 27.06.2022 |
  • Aleksey Korshuk - demo.ipynb) | 25.06.2022 |
  • Chen Chen
  • Balint Pato
  • Han Xiao - ai/clip-as-service?style=social)](https://github.com/jina-ai/clip-as-service) <ul><li>[data](https://sites.google.com/view/totally-looks-like-dataset)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jina-ai/docarray)</li><li>[website](https://clip-as-service.jina.ai/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL3UBBWOUVhFYRUa_gpYYKBqEAkO4sxmne), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/c/jina-ai)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jina-ai/clip-as-service/blob/main/docs/hosting/cas-on-colab.ipynb) | 19.06.2022 |
  • Han Xiao - ai/jina?style=social)](https://github.com/jina-ai/jina) <ul><li>[data](https://sites.google.com/view/totally-looks-like-dataset)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.jina.ai/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jina-ai/example-grafana-prometheus/blob/main/grafana-dashboards/flow.json)</li><li>[hub](https://hub.jina.ai/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL3UBBWOUVhFYRUa_gpYYKBqEAkO4sxmne), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/c/jina-ai)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/docs/Using_Jina_on_Colab.ipynb) | 11.06.2022 |
  • Jacob Kahn - gAAAAJ)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Ronan Collobert](https://ronan.collobert.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/flashlight/flashlight?style=social)](https://github.com/flashlight/flashlight) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.12465)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/flml/flashlight/tags?page=1&ordering=last_updated&name=cuda-latest)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://fl.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/arrayfire/arrayfire), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/vcpkg), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/arrayfire/arrayfire-ml/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nvidia/cub), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/USCiLab/cereal), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nothings/stb), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookincubator/gloo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/oneapi-src/oneDNN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/glog), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/gflags/gflags), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/flashlight/text)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/flashlight/flashlight/blob/master/flashlight/app/asr/tutorial/notebooks/FinetuneCTC.ipynb) | 01.06.2022 |
  • Elena Samuylova - dral)</li> <li>[Olga Filippova](https://github.com/0lgaF)</li></ul> | [![](https://img.shields.io/github/stars/evidentlyai/evidently?style=social)](https://github.com/evidentlyai/evidently) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.evidentlyai.com/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/0lgaF/my_tab_with_evidently)</li><li>[website](https://evidentlyai.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/c/EvidentlyAI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L4Pv6ExBQPM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Dd6ZzIgeBYkD_4bqWZ0RAdUpCU0b6Y6H) | 30.05.2022 |
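Evidently (linked above) reports data drift between a reference and a current dataset. One classical statistic it is built around is the two-sample Kolmogorov-Smirnov distance; a self-contained sketch of that statistic (this is the general textbook formula, not Evidently's implementation):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap between the
    two empirical CDFs.  Values near 1 suggest the feature's distribution
    has drifted badly; 0 means the empirical distributions coincide."""
    a, b = sorted(sample_a), sorted(sample_b)

    def cdf(sorted_xs, x):
        return bisect.bisect_right(sorted_xs, x) / len(sorted_xs)

    points = sorted(set(a) | set(b))
    return max(abs(cdf(a, x) - cdf(b, x)) for x in points)
```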
  • Caglar Gulcehre - Arnold](http://gabe.squirrelsoup.net/)</li> <li>[Jerry Li](https://github.com/jerryli27)</li> <li>[Mohammad Norouzi](https://norouzi.github.io/)</li> <li>[Matt Hoffman](https://www.mwhoffman.com/)</li> <li>[Ofir Nachum](https://scholar.google.com/citations?user=C-ZlBWMAAAAJ)</li> <li>[George Tucker](https://sites.google.com/view/gjt)</li> <li>[Nicolas Heess](https://scholar.google.com/citations?user=79k7bGEAAAAJ)</li> <li>[Nando de Freitas](https://github.com/nandodf)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/deepmind-research?style=social)](https://github.com/deepmind/deepmind-research/tree/master/rl_unplugged) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.13888), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.04543), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1709.06009), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.09656), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.11711), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.12238), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.09451), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.00690), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.11881), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.09575)</li><li>[data](https://console.cloud.google.com/storage/browser/rl_unplugged)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/lab), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/realworldrl_suite#installation)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/n8yNYzbUMJ0)</li></ul> | [![Open In 
Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/deepmind-research/blob/master/rl_unplugged/dmlab_r2d2.ipynb) | 26.05.2022 |
  • Mostafa Dehghani - badge.php?doi=10.1109/CVPR52688.2022.02070)](https://doi.org/10.1109/CVPR52688.2022.02070) [![](https://img.shields.io/github/stars/google-research/scenic?style=social)](https://github.com/google-research/scenic) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.11403)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/google-open-sources-scenic-a-jax-library-for-rapid-computer-vision-model-prototyping-and-894dbdeddbae)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/deeplearning/comments/qgyjck/r_google_opensources_scenic_a_jax_library_for/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/scenic/blob/main/scenic/common_lib/colabs/scenic_playground.ipynb) | 04.05.2022 |
  • Billy Lamberta - effectiveness/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/text-generation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/text/blob/master/docs/tutorials/text_generation.ipynb) | 02.05.2022 |
  • Kevin Frans - exploring-text-to-drawing-synthesis/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/BachiLi/diffvg/blob/master/apps/painterly_rendering.py)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/kvfrans/clipdraw/blob/main/clipdraw.ipynb) | 28.04.2022 |
  • Erik Nijkamp - tu.github.io/)</li><details><summary>others</summary><li>[Huan Wang](https://huan-december.github.io/)</li> <li>[Yingbo Zhou](https://scholar.google.com/citations?user=H_6RQ7oAAAAJ)</li> <li>[Silvio Savarese](https://cvgl.stanford.edu/silvio/)</li> <li>[Caiming Xiong](http://cmxiong.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/salesforce/CodeGen?style=social)](https://github.com/salesforce/CodeGen) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13474), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.02309)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/salesforce/jaxformer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/models?search=salesforce+codegen)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1fQI8OgzMAR0bquCrvhlAtXSw6iMFbVgI) | 23.04.2022 |
  • Dennis Ulmer - badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266) [![](https://img.shields.io/github/stars/Kaleidophon/deep-significance?style=social)](https://github.com/Kaleidophon/deep-significance) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.06815)</li><li>[blog post](https://machinelearningmastery.com/statistical-hypothesis-tests/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://deep-significance.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rtmdrr/replicability-analysis-NLP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rtmdrr/testSignificanceNLP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rtmdrr/DeepComparison)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Multiple_comparisons_problem)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Kaleidophon/deep-significance/blob/main/paper/deep-significance%20demo.ipynb) | 12.04.2022 |
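deep-significance (linked above) implements significance tests for comparing model scores, including permutation-based ones. As a concept sketch only (the library's headline test is ASO, which is more involved), here is a plain two-sided permutation test on the difference of means:

```python
import random

def permutation_test(scores_a, scores_b, trials=2000, seed=0):
    """Two-sided permutation test on the difference of means: the fraction
    of random relabelings whose gap is at least as large as the observed
    one.  Small p-values suggest the score difference is not chance."""
    rng = random.Random(seed)
    observed = abs(sum(scores_a) / len(scores_a) - sum(scores_b) / len(scores_b))
    pooled = list(scores_a) + list(scores_b)
    n, hits = len(scores_a), 0
    for _ in range(trials):
        rng.shuffle(pooled)
        a, b = pooled[:n], pooled[n:]
        if abs(sum(a) / n - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / trials
```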
  • Billy Lamberta - learning/glossary/#recurrent_neural_network)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/text-classification)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/text/blob/master/docs/tutorials/text_classification_rnn.ipynb) | 17.03.2022 |
  • Sabela Ramos - vincent-1958381)</li><details><summary>others</summary><li>[Hanna Yakubovich](https://github.com/yakubanna)</li> <li>[Daniel Toyama](https://github.com/kenjitoyama)</li> <li>[Anita Gergely](https://www.linkedin.com/in/anita-g-318064b2/)</li> <li>[Piotr Stanczyk](https://scholar.google.com/citations?user=fKVK0dYAAAAJ)</li> <li>[Raphaël Marinier](https://github.com/RaphaelMarinier)</li> <li>[Jeremiah Harmsen](https://github.com/jharmsen)</li> <li>[Olivier Pietquin](https://research.google/people/105812/)</li> <li>[Nikola Momchev](https://scholar.google.com/citations?user=PbWgaswAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/google-research/rlds?style=social)](https://github.com/google-research/rlds) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.02767)</li><li>[blog post](https://ai.googleblog.com/2021/12/rlds-ecosystem-to-generate-share-and.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/envlogger), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/rlds-creator), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Farama-Foundation/D4RL), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/dm_env/blob/master/docs/index.md)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](http://www.tensorflow.org/datasets/catalog/overview), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/robosuite_panda_pick_place_can), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/locomotion), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/mt_opt), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/external_tfrecord?hl=en#load_dataset_with_tfds), [<img src="images/tf.svg" alt="tf" 
height=20/>](https://www.tensorflow.org/api_docs/python/tf/data), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/data_performance#optimize_performance), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#shuffle), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/splits), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/api_docs/python/tfds/load)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/rlds/blob/main/rlds/examples/rlds_tutorial.ipynb) | 16.03.2022 |
  • Corentin Jemine - Ochir Tuguldur](https://github.com/tugstugi)</li></ul> | [![](https://img.shields.io/github/stars/CorentinJ/Real-Time-Voice-Cloning?style=social)](https://github.com/CorentinJ/Real-Time-Voice-Cloning) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1806.04558), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.08435), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.10135), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.10467)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/fatchord/WaveRNN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/coqui-ai/tts), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/resemble-ai/Resemblyzer)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-O_hYhToKoA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tugstugi/dl-colab-notebooks/blob/master/notebooks/RealTimeVoiceCloning.ipynb) | 07.03.2022 |
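The voice-cloning row above cites the GE2E speaker-verification paper (arXiv 1710.10467), where utterances are mapped to embeddings and compared by cosine similarity. The comparison step in isolation, written from the standard definition (names illustrative):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two speaker embeddings; d-vector systems
    accept a speaker match when this exceeds a tuned threshold."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)
```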
  • Junnan Li - li/home)</li> <li>[Caiming Xiong](http://cmxiong.com/)</li> <li>[Steven Hoi](https://sites.google.com/view/stevenhoi)</li></ul> | [![](https://img.shields.io/github/stars/salesforce/BLIP?style=social)](https://github.com/salesforce/BLIP) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.12086)</li><li>[blog post](https://blog.salesforceairesearch.com/blip-bootstrapping-language-image-pretraining/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairscale), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/salesforce/ALPRO), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/dmlc/decord), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/salesforce/ALBEF), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models/tree/main/timm)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/X2k7n4FuI7c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/salesforce/BLIP/blob/main/demo.ipynb) | 02.03.2022 |
  • Wilson Yan
  • Silero team - models?style=social)](https://github.com/snakers4/silero-models) <ul><li>[STT](https://thegradient.pub/towards-an-imagenet-moment-for-speech-to-text/), [STT](https://thegradient.pub/a-speech-to-text-practitioners-criticisms-of-industry-and-academia/), [STT](https://habr.com/ru/post/519562/)</li><li>[TTS](https://habr.com/ru/post/660571/), [TTS](https://habr.com/ru/post/549482/)</li><li>[Text Enhancement](https://habr.com/ru/post/581960/)</li><li>[VAD](https://thegradient.pub/one-voice-detector-to-rule-them-all/), [VAD](https://habr.com/ru/post/537276/)</li><li>[website](https://www.silero.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/snakers4/silero-models/blob/master/examples.ipynb) | 27.02.2022 |
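Silero's VAD (linked above) is a trained neural model that classifies audio frames as speech or silence. Purely to illustrate the frame-classification interface such a detector exposes — not Silero's method — here is a toy energy-threshold VAD:

```python
def energy_vad(frames, threshold):
    """Toy voice-activity detector: flag frames whose mean squared
    amplitude exceeds `threshold`.  Real VADs (like Silero's) learn this
    decision; the input/output shape is the same."""
    flags = []
    for frame in frames:
        energy = sum(s * s for s in frame) / len(frame)
        flags.append(energy > threshold)
    return flags
```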
  • Alexander Spirin
  • Eugene Kharitonov - lee)</li> <li>[Ali Elkahky](https://scholar.google.com/citations?user=KB3S8RoAAAAJ)</li> <li>[Wei-Ning Hsu](https://wnhsu.github.io/)</li> <li>[Abdelrahman Mohamed](https://ai.facebook.com/people/abdelrahman-mohamed/)</li> <li>[Emmanuel Dupoux](http://www.lscp.net/persons/dupoux/)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/textlesslib?style=social)](https://github.com/facebookresearch/textlesslib) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.07359)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/waveglow), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/keithito/tacotron), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/tacotron2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/pseeth/torch-stft)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/librispeech)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/textlesslib/blob/main/examples/resynthesis_and_continuation.ipynb) | 15.02.2022 |
  • Bowen Shi - Ning Hsu](http://people.csail.mit.edu/wnhsu/)</li> <li>[Kushal Lakhotia](https://about.me/hikushalhere)</li> <li>[Abdelrahman Mohamed](http://www.cs.toronto.edu/~asamir/)</li></ul> | [![](https://img.shields.io/github/stars/facebookresearch/av_hubert?style=social)](https://github.com/facebookresearch/av_hubert) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.02184), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.01763), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1810.04805), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.04890)</li><li>[blog post](https://ai.facebook.com/blog/ai-that-understands-speech-by-looking-as-well-as-hearing/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1bNXkfpHiVHzXQH8WjGhzQ-fsDxolpUjD) | 12.02.2022 |
  • Jonathan Shen
  • Alex Shonenkov - ai)</li></ul> | [![](https://img.shields.io/github/stars/ai-forever/ru-dolph?style=social)](https://github.com/ai-forever/ru-dolph) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.14165), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.12092), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1gmTDA13u709OXiAeXWGm7sPixRhEJCga) | 14.01.2022 |
  • Billy Lamberta - going-deeper-into-neural.html)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Inception)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/generative/deepdream.ipynb) | 13.01.2022 |
  • Ben Trevett - gradient-descent/)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/vision/stable/transforms.html#transforms-on-pil-image-only), [<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/vision/stable/transforms.html#transforms-on-torch-tensor-only)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Multilayer_perceptron)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bentrevett/pytorch-image-classification/blob/master/1_mlp.ipynb) | 26.12.2021 |
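The first notebook in Ben Trevett's image-classification series builds a multilayer perceptron. Its forward pass, stripped of PyTorch down to plain lists so the arithmetic is visible (a sketch, not the notebook's code):

```python
def mlp_forward(x, w1, b1, w2, b2):
    """Minimal two-layer MLP with ReLU.  w1[j][i] maps input i to hidden
    unit j; w2 maps hidden units to outputs.  Each layer is an affine map
    followed by a nonlinearity (identity on the output layer)."""
    hidden = [max(0.0, sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]
```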
  • Ben Trevett - lr-finder?style=social)](https://github.com/davidtvs/pytorch-lr-finder) <ul><li>[ILSVRC](https://image-net.org/challenges/LSVRC/)</li><li>[LR](https://sgugger.github.io/how-do-you-find-a-good-learning-rate.html)</li><li>[PMLR](https://proceedings.mlr.press/v9/glorot10a.html)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1409.0575)</li><li>[cifar-10](https://www.cs.toronto.edu/~kriz/cifar.html)</li><li>[dropout](https://sebastianraschka.com/faq/docs/dropout-activation.html)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2012/hash/c399862d3b9d6b76c8436e924a68c45b-Abstract.html)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/vision/stable/models.html)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/alexnet)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Regularization_(mathematics)), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/AlexNet)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bentrevett/pytorch-image-classification/blob/master/3_alexnet.ipynb) | 26.12.2021 |
  • Ben Trevett - net.org/challenges/LSVRC/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1409.1556), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1506.01186), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.06146), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1502.03167), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1805.11604)</li><li>[cifar-10](https://www.cs.toronto.edu/~kriz/cifar.html)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/vision/stable/models.html)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/vgg)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HR0lt1hlR6U?t=5900), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/j1jIoHN3m0s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RNnKtNrsrmg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bentrevett/pytorch-image-classification/blob/master/4_vgg.ipynb) | 26.12.2021 |
  • Ben Trevett - networks/)</li><li>[LeNet-5](http://yann.lecun.com/exdb/lenet/)</li><li>[guide](https://adeshpande3.github.io/A-Beginner%27s-Guide-To-Understanding-Convolutional-Neural-Networks/)</li><li>[paper](http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/lenet)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Convolution), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Sobel_operator), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Gaussian_blur)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bentrevett/pytorch-image-classification/blob/master/2_lenet.ipynb) | 26.12.2021 |
  • bazanovvanya - forever/music-composer?style=social)](https://github.com/ai-forever/music-composer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.05858)</li><li>[blog post](https://habr.com/ru/company/sberbank/blog/583592/)</li><li>[data](https://magenta.tensorflow.org/datasets/maestro), [data](https://colinraffel.com//projects/lmd/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/gwinndr/MusicTransformer-Pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/bytedance/GiantMIDI-Piano), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mdeff/fma)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/music-composer/blob/master/src/Music_Composer_Demo_Colab_en.ipynb) | 20.12.2021 |
  • Chi Wang - wu.github.io/)</li></ul> | [![](https://img.shields.io/github/stars/microsoft/FLAML?style=social)](https://github.com/microsoft/FLAML) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.04815), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.01571)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://microsoft.github.io/FLAML/)</li><li>[paper](https://www.microsoft.com/en-us/research/publication/flaml-a-fast-and-lightweight-automl-library/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCfU0zfFXHXdAd5x-WvFBk5A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/euXpDYGgkGM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/FLAML/blob/master/notebook/flaml_automl.ipynb) | 17.12.2021 |
  • Chris Cummins - eth.github.io/)</li> <li>[Brandon Cui](https://www.linkedin.com/in/bcui19/)</li><details><summary>others</summary><li>[Jason Ansel](https://jasonansel.com/)</li> <li>[Sahir Gomez](https://github.com/sahirgomez1)</li> <li>[Olivier Teytaud](https://github.com/teytaud)</li> <li>[Benoit Steiner](http://bsteiner.info/)</li> <li>[Yuandong Tian](http://yuandong-tian.com/)</li> <li>[Hugh Leather](https://github.com/hughleat)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/CompilerGym?style=social)](https://github.com/facebookresearch/CompilerGym) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.08267)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://facebookresearch.github.io/CompilerGym/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/CompilerGym/blob/development/examples/getting-started.ipynb) | 16.11.2021 |
  • Phil Wang - pytorch?style=social)](https://github.com/lucidrains/reformer-pytorch) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2001.04451), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.01470), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.05895), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.11556), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.02150), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2002.05202), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.05997), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.04887), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2002.07028), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.03404), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.09864)</li><li>[blog post](https://ai.googleblog.com/2020/01/reformer-efficient-transformer.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/routing-transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/sinkhorn-transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/performer-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/linear-attention-transformer/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/compressive-transformer-pytorch)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2019/hash/9d8df73a3cfbf3c5b47bc9b50f214aff-Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper_files/paper/2017/hash/3f5ee243547dee91fbd053c1c4a845aa-Abstract.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/i4H0kjxrias), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Kf3x3lqf9cQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0eTULzrOztQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1awNgXYtjvUeXl1gS-v1iyDXTJJ-fyJIK) | 07.11.2021 |
  • Alex Shonenkov - forever/ru-dalle?style=social)](https://github.com/ai-forever/ru-dalle) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/bes-dev/vqvae_dwt_distiller.pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/boomb0om/Real-ESRGAN-colab)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/multimodalart/rudalle)</li><li>[project](https://rudalle.ru/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/ru-dalle/blob/master/jupyters/ruDALLE-example-generation-A100.ipynb) | 03.11.2021 |
  • Cameron Smith - style-tf?style=social)](https://github.com/cysmith/neural-style-tf) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1604.08610), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1606.05897), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1508.06576)</li><li>[cvpr](https://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/Gatys_Image_Style_Transfer_CVPR_2016_paper.pdf)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Pastiche), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/The_Starry_Night), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/YUV), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Lab_color_space), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/YCbCr), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/CIELUV), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Pareidolia)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/14aJ7HQPbcP0sNRIY-FRO4u6lxtlyyxI_) | 01.10.2021 |
  • Katherine Crowson - bomze)</li></ul> | [![](https://img.shields.io/github/stars/chigozienri/VQGAN-CLIP-animations?style=social)](https://github.com/chigozienri/VQGAN-CLIP-animations) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCToztRy9FSTIhEen_1x4FAw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tg-bomze/collection-of-notebooks/blob/master/Text2Animation.ipynb) | 29.09.2021 |
  • Mingxing Tan
  • Anurag Pratik - Qian)</li> <li>[Yuxuan Sun](https://github.com/snyxan)</li> <li>[Ryan Drew](https://rdrew.dev/)</li> <li>[Sara Elkafrawy](https://github.com/saraEbrahim)</li> <li>[Anoushka Tiwari](https://www.linkedin.com/in/anoushka-tiwari)</li> <li>[Tucker Hart](https://www.linkedin.com/in/tucker-hart-05a638133)</li> <li>[Mary Williamson](https://scholar.google.com/citations?user=Ys4xB-QAAAAJ)</li> <li>[Abhinav Gupta](http://www.cs.cmu.edu/~abhinavg/)</li> <li>[Arthur Szlam](https://scholar.google.com/citations?user=u3-FxUgAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/droidlet?style=social)](https://github.com/facebookresearch/droidlet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2101.10384), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.08584)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://facebookresearch.github.io/droidlet/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/droidlet/blob/master/examples_and_tutorials/tutorials/droidlet_for_physical_robots.ipynb) | 15.09.2021 |
  • Ben Wang - me/)</li> <li>[Janko Prester](https://www.jankoprester.com/)</li></ul> | [![](https://img.shields.io/github/stars/kingoflolz/mesh-transformer-jax?style=social)](https://github.com/kingoflolz/mesh-transformer-jax) <ul><li>[The Pile](https://pile.eleuther.ai/)</li><li>[blog post](https://arankomatsuzaki.wordpress.com/2021/06/04/gpt-j/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/EleutherAI/gpt-neox), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeed)</li><li>[web demo](https://6b.eleuther.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/kingoflolz/mesh-transformer-jax/blob/master/colab_demo.ipynb) | 15.09.2021 |
  • Тимчишин Віталій - 1l4XYhrIyS6A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-RdOwhmqP5s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/R13BD8qKeTg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZkjP5RJLQF4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J4Wdy0Wc_xQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/mBcLRGuAFUk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YIGtalP1mv0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Yz5pySyEtsU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/x5zLaWT5KPs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yBwpo-L80Mc), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL3FW7Lu3i5JvHM8ljYj-zLfQRF3EO8sYv)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/fbeilstein/machine_learning/blob/master/lecture_01_introduction.ipynb) | 02.09.2021 |
  • Mikael Alafriz - sonic-dreams?style=social)](https://github.com/mikaelalafriz/lucid-sonic-dreams) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/justinpinkney/awesome-pretrained-stylegan2)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/introducing-lucid-sonic-dreams-sync-gan-art-to-music-with-a-few-lines-of-python-code-b04f88722de1)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/l-nGC-ve7sI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Y5i50xSFIuN3V4Md8TB30_GOAtts7RQD) | 24.08.2021 |
  • Max Woolf - neural-networks/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=RW7mP6BfZuY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1mMKGnVxirJnqDViH7BDJxFqWrsXlPSoK) | 13.07.2021 |
  • nvidia - up-deep-learning-inference-using-tensorrt-updated/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.nvidia.com/deeplearning/tensorrt/)</li><li>[forum](https://forums.developer.nvidia.com/c/ai-data-science/deep-learning/tensorrt)</li><li>[website](https://developer.nvidia.com/tensorrt)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TU5BMU6iYZ0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/6rZNLaS775w), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/G_KhUFCUSsY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7kJ-jph9gCw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/NVIDIA/TensorRT/blob/main/quickstart/IntroNotebooks/0.%20Running%20This%20Guide.ipynb) | 10.06.2021 |
  • Xintao Wang - ntu.com/person/ccloy/)</li> <li>[Chao Dong](https://scholar.google.com/citations?user=OSDCB0UAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/XPixelGroup/BasicSR?style=social)](https://github.com/XPixelGroup/BasicSR) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.02181)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://basicsr.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/ESRGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xindongzhang/ECBSR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Lotayou/Face-Renovation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/csxmli2016/DFDNet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyView), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyFigure), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/SFTGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/DNI), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyCrawler), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyWriting)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/KaMYsxWkmww)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JQScYICvEC3VqaabLu-lxvq9h7kSV1ML) | 07.06.2021 |
  • James Bergstra - WgLkAAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/hyperopt/hyperopt?style=social)](https://github.com/hyperopt/hyperopt) <ul><li>[ICML](https://proceedings.mlr.press/v28/bergstra13.html)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](http://hyperopt.github.io/hyperopt/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-sklearn), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-nnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-convnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-gpsmbo)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2011/hash/86e8f7ab32cfd12577bc2619bc635690-Abstract.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Mp1xnPfE4PY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tdwgR1AqQ8Y), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tteE_Vtmrv4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/hyperopt/hyperopt/blob/master/tutorial/01.BasicTutorial.ipynb) | 01.06.2021 |
  • Billy Lamberta - learning/glossary/#convolutional_neural_network)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/images/cnn.ipynb) | 21.05.2021 |
  • Max Woolf - rnn/master/data/tinyshakespeare/input.txt)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.aitextgen.io/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/144MdX5aLqrQ3-YW-po81CQMrD6kpgpYh) | 17.05.2021 |
  • Max Woolf - rnn/master/data/tinyshakespeare/input.txt)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.aitextgen.io/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/text-generation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/15qBZx5y9rdaQSyWpsreMDnTiZ5IlN0zD) | 17.05.2021 |
  • Nils Reimers - NLP/Opus-MT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairseq/tree/main/examples/multilingual)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1X47vgSiOphpxS5w_LPtjQgJmiSTNfRNC) | 26.04.2021 |
  • Vijish Madhavan
  • Zeyu Chen - zxf)</li><details><summary>others</summary><li>[Jinxuan Qiu](https://github.com/kinghuin)</li> <li>[Yuhan Shen](https://github.com/ShenYuhan)</li> <li>[Yuying Hao](https://github.com/haoyuying)</li> <li>[Xiaojie Chen](https://github.com/KPatr1ck)</li></ul></details> | [![](https://img.shields.io/github/stars/PaddlePaddle/PaddleHub?style=social)](https://github.com/PaddlePaddle/PaddleHub) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://paddlehub.readthedocs.io/en)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleOCR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleDetection), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CMU-Perceptual-Computing-Lab/openpose), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleSeg), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleClas), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/ERNIE), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/baidu/LAC), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/baidu/DDParser), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleSpeech)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/PaddlePaddle)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/analytics-vidhya/paddlehub-fdd1ec75a07b)</li><li>[website](https://www.paddlepaddle.org.cn/en)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9adXuF_lTSg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/PaddlePaddle/PaddleHub/blob/develop/demo/serving/bentoml/cloud-native-model-serving-with-bentoml.ipynb) | 20.04.2021 |
  • Silvia Terragni - fersini)</li> <li>[Antonio Candelieri](https://www.unimib.it/antonio-candelieri)</li> <li>[Pietro Tropeano](https://github.com/pietrotrope)</li><details><summary>others</summary><li>[Bruno Galuzzi](https://github.com/brunoG89)</li> <li>[Lorenzo Famiglini](https://github.com/lorenzofamiglini)</li> <li>[Davide Pietrasanta](https://github.com/davidepietrasanta)</li></ul></details> | [![](https://img.shields.io/github/stars/mind-Lab/octis?style=social)](https://github.com/mind-Lab/octis) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.01488)</li><li>[data](https://www.dbpedia.org/resources/ontology/), [data](https://www.statmt.org/europarl/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/estebandito22/PyTorchAVITM)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/a-beginners-guide-to-octis-optimizing-and-comparing-topic-models-is-simple-590554ec9ba6), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/a-beginners-guide-to-octis-vol-2-optimizing-topic-models-1214e58be1e5)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2000/hash/f9d1152547c0bde01830b7e8bd60024c-Abstract.html)</li><li>[paper](https://aclanthology.org/2021.eacl-demos.31/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/20-newsgroups)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nPmiWBFFJ8E)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/MIND-Lab/OCTIS/blob/master/examples/OCTIS_Optimizing_CTM.ipynb) | 19.04.2021 |
  • Haoqi Fan - wA73gAAAAJ)</li> <li>[Aaron Adcock](https://scholar.google.com/citations?&user=oa78zHUAAAAJ)</li> <li>[Wan-Yen Lo](https://github.com/wanyenlo)</li> <li>[Christoph Feichtenhofer](http://feichtenhofer.github.io/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3474085.3478329)](https://doi.org/10.1145/3474085.3478329) [![](https://img.shields.io/github/stars/facebookresearch/pytorchvideo?style=social)](https://github.com/facebookresearch/pytorchvideo) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.09887), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.11227)</li><li>[blog post](https://ai.facebook.com/blog/pytorchvideo-a-deep-learning-library-for-video-understanding/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pytorchvideo.readthedocs.io/en/latest/index.html)</li><li>[website](https://github.com/facebookresearch/pytorchvideo)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/b7-gnpqz9Qg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/pytorchvideo/blob/main/tutorials/accelerator/Build_your_model_with_PytorchVideo_Accelerator.ipynb) | 13.04.2021 |
  • EleutherAI - neo?style=social)](https://github.com/EleutherAI/gpt-neo) <ul><li>[GPT-2](https://openai.com/blog/better-language-models/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.14165), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.05150), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1701.06538)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tensorflow/mesh), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/EleutherAI/gpt-neox/)</li><li>[pretrained](https://the-eye.eu/public/AI/gptneo-release/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/EleutherAI/GPTNeo/blob/master/GPTNeo_example_notebook.ipynb) | 28.03.2021 |
  • Billy Lamberta
  • Phil Wang - sleep?style=social)](https://github.com/lucidrains/big-sleep) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1809.11096)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/CLIP)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/bigsleep/comments/lxawb4/how_to_use_some_of_the_newer_features_of/), [<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/bigsleep/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1MEWKbm-driRNF8PrU7ogS5o3se-ePyPb) | 17.03.2021 |
  • Phil Wang - daze?style=social)](https://github.com/lucidrains/deep-daze) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.09661)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/CLIP)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/deepdaze/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1_YOHdORb0Fg1Q7vWZ_KlrtFe9Ur3pmVj) | 17.03.2021 |
  • Billy Lamberta - dataset)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/generative/dcgan.ipynb) | 12.03.2021 |
  • Billy Lamberta - net.org/)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/keras/applications/MobileNetV2)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/generative/adversarial_fgsm.ipynb) | 12.03.2021 |
  • Ali Jahanian - design/gan_steerability?style=social)](https://github.com/ali-design/gan_steerability) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.07171), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1809.11096)</li><li>[project](https://ali-design.github.io/gan_steerability/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nS0V64sF7Cw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1kn6yG8PqD1U2bUcy32V1iAVjzlcQWcG3) | 04.03.2021 |
  • Google - discuss)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://trax-ml.readthedocs.io/en/latest/)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/abhinavwalia95/entity-annotated-corpus), [<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/dschettler8845/exploration-of-trax-framework)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/get-started-with-google-trax-for-nlp-ff8dcd3119cf), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/analytics-vidhya/brief-view-of-googles-trax-library-b78eae008cb6)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/overview), [<img src="images/tf.svg" alt="tf" height=20/>](https://tensorflow.org/guide/tf_numpy)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qlTsaHAtJBY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/trax/blob/master/trax/intro.ipynb) | 17.02.2021 |
  • Ian Osband - lattimore.com/)</li> <li>[Csaba Szepesvari](https://sites.ualberta.ca/~szepesva/)</li> <li>[Satinder Singh](http://web.eecs.umich.edu/~baveja/)</li> <li>[Benjamin Van Roy](https://web.stanford.edu/~bvr/)</li> <li>[Richard Sutton](http://www.incompleteideas.net/)</li> <li>[David Silver](https://www.davidsilver.uk/)</li> <li>[Hado Van Hasselt](https://hadovanhasselt.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/bsuite?style=social)](https://github.com/deepmind/bsuite) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/gym)</li><li>[paper](https://openreview.net/forum?id=rygf-kSYwH)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Wcv4eU_qtZU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1rU20zJ281sZuMD1DHbsODFr1DbASL0RH) | 13.02.2021 |
  • Rama Kumar
  • Vijish Madhavan - Me?style=social)](https://github.com/vijishmadhavan/Toon-Me) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.10196), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1707.02921), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1603.08155)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/vijishmadhavan/Light-Up/blob/master/Toon_Me_(Try_it_on_Colab).ipynb) | 22.01.2021 |
  • Chase Roberts
  • Romain Hennequin - spleeter-deezer-r-d-source-separation-engine-2b88985e797e)</li><li>[data](https://sigsep.github.io/datasets/musdb.html)</li><li>[project](https://research.deezer.com/projects/spleeter.html)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deezer/spleeter/blob/master/spleeter.ipynb) | 10.01.2021 |
  • Erwin Coumans - 7nkNCfoEKap4z3qadLVj8QB4a), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9p0O941opGc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kZxPaGdoSJY), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL9LUFPiB6N3YrS0O7XM_1sBVWRnSRB643)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bulletphysics/bullet3/blob/master/examples/pybullet/notebooks/HelloPyBullet.ipynb) | 13.10.2020 |
  • Javier Gamazo - remover-partial-convolutions), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zzh8829/yolov3-tf2)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=_dRjY9gMcxE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JDpH8MAjaKoekQ_H9ZaxYJ9_axiDtDGm) | 22.08.2020 |
  • Bolei Zhou - segmentation-pytorch?style=social)](https://github.com/CSAILVision/semantic-segmentation-pytorch) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1608.05442), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1612.01105), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.10221), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.04514)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/CSAILVision/sceneparsing), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vacancy/Synchronized-BatchNorm-PyTorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hszhao/semseg)</li><li>[project](http://sceneparsing.csail.mit.edu/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CSAILVision/semantic-segmentation-pytorch/blob/master/notebooks/DemoSegmenter.ipynb) | 21.08.2020 |
  • Dan Holtmann-Rice - config?style=social)](https://github.com/google/gin-config) <ul><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/stop-worrying-about-configs-with-gin-218562dd5c91)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/gin-config/blob/master/gin/gin_intro.ipynb) | 13.08.2020 |
  • Pablo Castro - 2.0.html)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://google.github.io/dopamine/docker/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://google.github.io/dopamine/docs/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/atari-py#roms), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/mujoco-py#install-mujoco)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/the-21st-century/google-dopamine-new-rl-framework-f84a35b7fb3f)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/FWFoyFjeAaM?feature=share), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bd4CsDp00RA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/dopamine/blob/master/dopamine/colab/jax_agent_visualizer.ipynb) | 03.08.2020 |
  • Dale Markowitz - learning-for-sports)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://manivannan-ai.medium.com/find-the-angle-between-three-points-from-2d-using-python-348c513e2cd)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=yLrOy2Xedgk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/making_with_ml/blob/master/sports_ai/Sports_AI_Analysis.ipynb) | 14.07.2020 |
  • Alexey Bochkovskiy - the-most-accurate-real-time-neural-network-on-ms-coco-dataset-73adfd3602fe), [<img src="images/medium.svg" alt="medium" height=20/>](https://alexeyab84.medium.com/scaled-yolo-v4-is-the-best-neural-network-for-object-detection-on-ms-coco-dataset-39dfa22fa982)</li><li>[project](https://pjreddie.com/darknet/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/gydxzd/p_yolov4_the_most_accurate_realtime_neural/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1_SiUOYUoOI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YDFf-TqJOFE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1_GdoqCJWXsChrOiY8sZMr_zbr_fH-0Fg) | 25.06.2020 |
  • Julien Valentin - keskin)</li> <li>[Pavel Pidlypenskyi](https://github.com/podlipensky)</li> <li>[Ameesh Makadia](https://github.com/amakadia)</li><details><summary>others</summary><li>[Avneesh Sud](https://github.com/avneesh-g)</li> <li>[Sofien Bouaziz](http://sofienbouaziz.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450508.3464595)](https://doi.org/10.1145/3450508.3464595) [![](https://img.shields.io/github/stars/tensorflow/graphics?style=social)](https://github.com/tensorflow/graphics) <ul><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/computer-graphics-computer-vision-tensorflow-graphics-110e955e26bb)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/graphic)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/_TFGraphics_)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Un0JDL3i5Hg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/graphics/blob/master/tensorflow_graphics/notebooks/6dof_alignment.ipynb) | 20.05.2020 |
  • David Bau - Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li> <li>[Hendrik Strobelt](http://hendrik.strobelt.com/)</li> <li>[Bolei Zhou](https://boleizhou.github.io/)</li><details><summary>others</summary><li>[Joshua Tenenbaum](https://mitibmwatsonailab.mit.edu/people/joshua-tenenbaum/)</li> <li>[William Freeman](https://billf.mit.edu/)</li> <li>[Antonio Torralba](https://groups.csail.mit.edu/vision/torralbalab/)</li></ul></details> | [![](https://img.shields.io/github/stars/CSAILVision/GANDissect?style=social)](https://github.com/CSAILVision/GANDissect) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.10597), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1901.09887), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.10221)</li><li>[demo](http://gandissect.res.ibm.com/ganpaint.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/CSAILVision/NetDissect), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/junyanz/iGAN)</li><li>[project](https://gandissect.csail.mit.edu/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=yVCgUYe4JTM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/SIDN-IAP/global-model-repr/blob/master/notebooks/gandissect_solutions.ipynb) | 04.05.2020 |
  • Malcolm Reynolds - jrae)</li> <li>[Andreas Fidjeland](https://github.com/akfidjeland)</li> <li>[Fabio Viola](https://github.com/fabioviola)</li><details><summary>others</summary><li>[Adrià Puigdomènech](https://github.com/adria-p)</li> <li>[Frederic Besse](https://github.com/fbesse)</li> <li>[Tim Green](http://tfgg.me/)</li> <li>[Sébastien Racanière](https://scholar.google.com/citations?user=o-h0vrQAAAAJ)</li> <li>[Gabriel Barth-Maron](https://github.com/fastturtle)</li> <li>[Diego Casas](https://github.com/diegolascasas)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/sonnet?style=social)](https://github.com/deepmind/sonnet) <ul><li>[blog post](https://www.deepmind.com/blog/open-sourcing-sonnet-a-new-library-for-constructing-neural-networks)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://sonnet.readthedocs.io/en/latest/index.html)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2016/hash/fb87582825f9d28a8d42c5e5e5e8b23d-Abstract.html)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/checkpoint), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/saved_model)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rlpQjnUvoKw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/sonnet/blob/v2/examples/little_gan_on_mnist.ipynb) | 17.04.2020 |
  • tmoneyx01 - client-py?style=social)](https://github.com/mdai/mdai-client-py) <ul><li>[annotator](https://public.md.ai/annotator/project/PVq9raBJ)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.md.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mdai/ml-lessons/blob/master/lesson1-xray-images-classification.ipynb) | 07.03.2020 |
  • Qiusheng Wu - visualization/folium?style=social)](https://github.com/python-visualization/folium) <ul><li>[api](https://developers.google.com/earth-engine/python_install)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/giswqs/qgis-earthengine-examples/blob/master/Folium/ee-api-folium-setup.ipynb) | 20.01.2020 |
  • Ashish Vaswani - MdPcAAAAJ)</li> <li>[Eugene Brevdo](https://ebrevdo.github.io/)</li> <li>[François Chollet](https://fchollet.com/)</li><details><summary>others</summary><li>[Aidan Gomez](https://gom.ai/)</li> <li>[Stephan Gouws](https://scholar.google.com/citations?user=lLTdYUYAAAAJ)</li> <li>[Llion Jones](https://www.linkedin.com/in/llion-jones-9ab3064b)</li> <li>[Łukasz Kaiser](https://scholar.google.com/citations?user=JWmiQR0AAAAJ)</li> <li>[Nal Kalchbrenner](https://www.nal.ai/)</li> <li>[Niki Parmar](https://github.com/nikiparmar)</li> <li>[Ryan Sepassi](https://ryansepassi.com/)</li> <li>[Noam Shazeer](https://github.com/nshazeer)</li> <li>[Jakob Uszkoreit](https://scholar.google.com/citations?user=mOG0bwsAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/tensorflow/tensor2tensor?style=social)](https://github.com/tensorflow/tensor2tensor) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1803.07416), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1812.02825), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.03762), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.03059), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.05137), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.09797)</li><li>[blog post](https://ai.googleblog.com/2017/06/accelerating-deep-learning-research.html)</li><li>[data](https://research.fb.com/downloads/babi/)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/tensor2tensor-and-one-model-to-learn-them-all-7ef3f9b61ba4)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://tensorflow.github.io/tensor2tensor/cloud_mlengine.html), [<img src="images/tf.svg" alt="tf" height=20/>](https://tensorflow.github.io/tensor2tensor/cloud_tpu.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/O2UvKxaOH7c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VYQ8n3Besrw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/cS2UZKHq4i4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/tensor2tensor/blob/master/tensor2tensor/notebooks/Transformer_translate.ipynb) | 14.01.2020 |
  • Andrey Nikishaev - learning-world/tutorial-making-road-traffic-counting-app-based-on-computer-vision-and-opencv-166937911660)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=_o5iLbRHKao)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12N4m_RYKqrpozRzh9qe7nQE_sIqQH9U8) | 10.01.2020 |
  • Tanuj Jain - tran.com/)</li></ul> | [![](https://img.shields.io/github/stars/idealo/imagededup?style=social)](https://github.com/idealo/imagededup) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1704.04861)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://fullstackml.com/wavelet-image-hash-in-python-3504fdd282b5)</li><li>[project](https://idealo.github.io/imagededup/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/idealo/imagededup/blob/master/examples/CIFAR10_duplicates.ipynb) | 03.10.2019 |
  • Billy Lamberta - Or](https://danielcohenor.com/)</li> <li>[Adam Roberts](https://github.com/adarob)</li> <li>[Jesse Engel](https://github.com/jesseengel)</li> <li>[Google](https://www.tensorflow.org/)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li> <li>[Curtis Hawthorne](https://github.com/cghawthorne)</li> <li>[Eli Shechtman](https://research.adobe.com/person/eli-shechtman/)</li> <li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li> <li>[Yuval Alaluf](https://yuval-alaluf.github.io/)</li> <li>[Xintao Wang](https://xinntao.github.io/)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li> <li>[Patrick Esser](https://github.com/pesser)</li> <li>[Robin Rombach](https://github.com/rromb)</li> <li>[Or Patashnik](https://orpatashnik.github.io/)</li> <li>[Antonio Torralba](https://groups.csail.mit.edu/vision/torralbalab/)</li> <li>[Bolei Zhou](https://boleizhou.github.io/)</li> <li>[Krzysztof Ostrowski](https://github.com/krzys-ostrowski)</li> <li>[Max Woolf](https://minimaxir.com/)</li> <li>[Jon Barron](https://jonbarron.info/)</li> <li>[Xiaohua Zhai](https://github.com/xiaohuazhai)</li> <li>[Ishan Misra](https://imisra.github.io/)</li> <li>[Nikhila Ravi](https://nikhilaravi.com/)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li> <li>[Karen Simonyan](https://scholar.google.com/citations?user=L7lMQkQAAAAJ)</li> <li>[Amit Bermano](https://www.cs.tau.ac.il/~amberman/)</li> <li>[Jun-Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li> <li>[Phil Wang](https://lucidrains.github.io/)</li> <li>[Ben Trevett](https://bentrevett.com/)</li></ul> | <ul><li>tensorflow/models [![](https://img.shields.io/github/stars/tensorflow/models?style=social)](https://github.com/tensorflow/models)</li> <li>CompVis/stable-diffusion [![](https://img.shields.io/github/stars/CompVis/stable-diffusion?style=social)](https://github.com/CompVis/stable-diffusion)</li> <li>openai/whisper [![](https://img.shields.io/github/stars/openai/whisper?style=social)](https://github.com/openai/whisper)</li> <li>CorentinJ/Real-Time-Voice-Cloning [![](https://img.shields.io/github/stars/CorentinJ/Real-Time-Voice-Cloning?style=social)](https://github.com/CorentinJ/Real-Time-Voice-Cloning)</li> <li>ultralytics/yolov5 [![](https://img.shields.io/github/stars/ultralytics/yolov5?style=social)](https://github.com/ultralytics/yolov5)</li> <li>KillianLucas/open-interpreter [![](https://img.shields.io/github/stars/KillianLucas/open-interpreter?style=social)](https://github.com/KillianLucas/open-interpreter)</li> <li>iperov/DeepFaceLab [![](https://img.shields.io/github/stars/iperov/DeepFaceLab?style=social)](https://github.com/iperov/DeepFaceLab)</li> <li>facebookresearch/segment-anything [![](https://img.shields.io/github/stars/facebookresearch/segment-anything?style=social)](https://github.com/facebookresearch/segment-anything)</li> <li>jakevdp/PythonDataScienceHandbook [![](https://img.shields.io/github/stars/jakevdp/PythonDataScienceHandbook?style=social)](https://github.com/jakevdp/PythonDataScienceHandbook)</li> <li>LAION-AI/Open-Assistant [![](https://img.shields.io/github/stars/LAION-AI/Open-Assistant?style=social)](https://github.com/LAION-AI/Open-Assistant)</li> <li>Stability-AI/stablediffusion [![](https://img.shields.io/github/stars/Stability-AI/stablediffusion?style=social)](https://github.com/Stability-AI/stablediffusion)</li> <li>XingangPan/DragGAN [![](https://img.shields.io/github/stars/XingangPan/DragGAN?style=social)](https://github.com/XingangPan/DragGAN)</li> <li>microsoft/visual-chatgpt [![](https://img.shields.io/github/stars/microsoft/visual-chatgpt?style=social)](https://github.com/microsoft/visual-chatgpt)</li> <li>TencentARC/GFPGAN [![](https://img.shields.io/github/stars/TencentARC/GFPGAN?style=social)](https://github.com/TencentARC/GFPGAN)</li> <li>lllyasviel/Fooocus [![](https://img.shields.io/github/stars/lllyasviel/Fooocus?style=social)](https://github.com/lllyasviel/Fooocus)</li> <li>google-research/google-research [![](https://img.shields.io/github/stars/google-research/google-research?style=social)](https://github.com/google-research/google-research)</li> <li>suno-ai/bark [![](https://img.shields.io/github/stars/suno-ai/bark?style=social)](https://github.com/suno-ai/bark)</li> <li>ray-project/ray [![](https://img.shields.io/github/stars/ray-project/ray?style=social)](https://github.com/ray-project/ray)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>facebookresearch/fairseq [![](https://img.shields.io/github/stars/facebookresearch/fairseq?style=social)](https://github.com/facebookresearch/fairseq)</li> <li>coqui-ai/TTS [![](https://img.shields.io/github/stars/coqui-ai/TTS?style=social)](https://github.com/coqui-ai/TTS)</li> <li>facebookresearch/detectron2 [![](https://img.shields.io/github/stars/facebookresearch/detectron2?style=social)](https://github.com/facebookresearch/detectron2)</li> <li>google/jax [![](https://img.shields.io/github/stars/google/jax?style=social)](https://github.com/google/jax)</li> <li>xinntao/Real-ESRGAN [![](https://img.shields.io/github/stars/xinntao/Real-ESRGAN?style=social)](https://github.com/xinntao/Real-ESRGAN)</li> <li>deezer/spleeter [![](https://img.shields.io/github/stars/deezer/spleeter?style=social)](https://github.com/deezer/spleeter)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>svc-develop-team/so-vits-svc [![](https://img.shields.io/github/stars/svc-develop-team/so-vits-svc?style=social)](https://github.com/svc-develop-team/so-vits-svc)</li> <li>huggingface/diffusers [![](https://img.shields.io/github/stars/huggingface/diffusers?style=social)](https://github.com/huggingface/diffusers)</li> <li>openai/CLIP [![](https://img.shields.io/github/stars/openai/CLIP?style=social)](https://github.com/openai/CLIP)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>AlexeyAB/darknet [![](https://img.shields.io/github/stars/AlexeyAB/darknet?style=social)](https://github.com/AlexeyAB/darknet)</li> <li>openai/gpt-2 [![](https://img.shields.io/github/stars/openai/gpt-2?style=social)](https://github.com/openai/gpt-2)</li></ul> | <ul><li>AlphaFold [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03819-2)](https://doi.org/10.1038/s41586-021-03819-2)</li> <li>MoCo [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00975)](https://doi.org/10.1109/CVPR42600.2020.00975)</li> <li>EfficientDet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.01079)](https://doi.org/10.1109/CVPR42600.2020.01079)</li> <li>DeepLabCut [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41593-018-0209-y)](https://doi.org/10.1038/s41593-018-0209-y)</li> <li>StyleGAN 2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00813)](https://doi.org/10.1109/CVPR42600.2020.00813)</li> <li>Fine-tuning a BERT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/N19-1423)](https://doi.org/10.18653/v1/N19-1423)</li> <li>ConvNeXt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>Neural Style Transfer [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1167/16.12.326)](https://doi.org/10.1167/16.12.326)</li> <li>PIFu [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV.2019.00239)](https://doi.org/10.1109/ICCV.2019.00239)</li> <li>Taming Transformers for High-Resolution Image Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268)</li> <li>VIBE [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00530)](https://doi.org/10.1109/CVPR42600.2020.00530)</li> <li>InterFaceGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00926)](https://doi.org/10.1109/CVPR42600.2020.00926)</li> <li>Pixel2Style2Pixel [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00232)](https://doi.org/10.1109/CVPR46437.2021.00232)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> <li>ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)</li> <li>PIFuHD [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00016)](https://doi.org/10.1109/CVPR42600.2020.00016)</li> <li>Nerfies [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00581)](https://doi.org/10.1109/ICCV48922.2021.00581)</li> <li>Skillful Precipitation Nowcasting Using Deep Generative Models of Radar [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03854-z)](https://doi.org/10.1038/s41586-021-03854-z)</li> <li>Parallel WaveGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICASSP40776.2020.9053795)](https://doi.org/10.1109/ICASSP40776.2020.9053795)</li> <li>BiT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58558-7_29)](https://doi.org/10.1007/978-3-030-58558-7_29)</li> <li>encoder4editing [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450626.3459838)](https://doi.org/10.1145/3450626.3459838)</li> <li>Wav2Lip [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3394171.3413532)](https://doi.org/10.1145/3394171.3413532)</li> <li>SeFa [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00158)](https://doi.org/10.1109/CVPR46437.2021.00158)</li> <li>Cleanlab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1613/jair.1.12125)](https://doi.org/10.1613/jair.1.12125)</li> <li>CartoonGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR.2018.00986)](https://doi.org/10.1109/CVPR.2018.00986)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>TediGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00229)](https://doi.org/10.1109/CVPR46437.2021.00229)</li> <li>ReStyle [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00664)](https://doi.org/10.1109/ICCV48922.2021.00664)</li> <li>StyleGAN-NADA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530164)](https://doi.org/10.1145/3528223.3530164)</li> <li>3D Photo Inpainting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00805)](https://doi.org/10.1109/CVPR42600.2020.00805)</li> <li>AlphaTensor [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4)</li></ul> |
  • awesome-colab-notebooks - [![](https://starchart.cc/amrzv/awesome-colab-notebooks.svg)](https://starchart.cc/amrzv/awesome-colab-notebooks)