Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.


awesome-colab-notebooks

Collection of Google Colaboratory notebooks for fast and easy experiments
https://github.com/amrzv/awesome-colab-notebooks


  • Research

    • [MiniGPT-4](https://github.com/Vision-CAIR/MiniGPT-4) (Deyao Zhu, [Xiang Li](https://xiangli.ac.cn/), [Mohamed Elhoseiny](https://www.mohamed-elhoseiny.com/)): [arXiv](https://arxiv.org/abs/2304.10592), [FastChat](https://github.com/lm-sys/FastChat), [dataset](https://huggingface.co/datasets/Vision-CAIR/cc_sbu_align), [model](https://huggingface.co/Vision-CAIR/MiniGPT-4), [project](https://minigpt-4.github.io/), videos ([1](https://youtu.be/__tftoxpBAw), [2](https://youtu.be/wUONNv7guXI), [3](https://youtu.be/hNAFuuXYL58), [4](https://youtu.be/SAjrpYjx0ps)), [Colab](https://colab.research.google.com/drive/1OK4kYsZphwt5DXchKkzMBjYF6jnkqh4R) | 23.04.2023
    • [CodeFormer](https://github.com/sczhou/CodeFormer) (Shangchen Zhou, [Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)): [arXiv](https://arxiv.org/abs/2206.11253), [NeurIPS](https://proceedings.neurips.cc/paper_files/paper/2022/hash/c573258c38d0a3919d8c1364053c45df-Abstract-Conference.html), related code ([unleashing-transformers](https://github.com/samb-t/unleashing-transformers), [yolov5-face](https://github.com/deepcam-cn/yolov5-face), [facexlib](https://github.com/xinntao/facexlib)), [project](https://shangchenzhou.com/projects/CodeFormer/), videos ([1](https://youtu.be/d3VDpkXlueI), [2](https://youtu.be/PtwWu-FugbA), [3](https://youtu.be/ORtYP8NW4T0), [4](https://youtu.be/xc5lKOKBCcg)), [Colab](https://colab.research.google.com/drive/1m52PNveE4PBhYrecj34cnpEeiHcC5LTb) | 21.04.2023
    • [Neural-Style-Transfer](https://github.com/titu1994/Neural-Style-Transfer) (Somshubra Majumdar): [DOI](https://doi.org/10.1167/16.12.326), arXiv ([1508.06576](http://arxiv.org/abs/1508.06576), [1605.04603](http://arxiv.org/abs/1605.04603), [1606.05897](https://arxiv.org/abs/1606.05897)), [Colab](https://colab.research.google.com/github/titu1994/Neural-Style-Transfer/blob/master/NeuralStyleTransfer.ipynb) | 22.01.2021
    • MusicXML Document Structure Documentation (Prakruti Joshi): [MusicXML for developers](https://www.musicxml.com/for-developers/), [Colab](https://colab.research.google.com/github/magenta/magenta-demos/blob/master/colab-notebooks/MusicXML_Document_Structure_Documentation.ipynb) | 08.01.2021
    • SVG-VAE (Raphael Gontijo Lopes): [DOI](https://doi.org/10.1109/ICCV.2019.00802), [arXiv](https://arxiv.org/abs/1904.02632), [blog post](https://magenta.tensorflow.org/svg-vae), [Colab](https://colab.research.google.com/github/magenta/magenta-demos/blob/master/colab-notebooks/vae_svg_decoding.ipynb) | 08.01.2021
    • [FGVC](https://github.com/vt-vl-lab/FGVC) (Chen Gao, [Jia-Bin Huang](https://jbhuang0604.github.io/)): [DOI](https://doi.org/10.1007/978-3-030-58610-2_42), [arXiv](https://arxiv.org/abs/2009.01835), [project](http://chengao.vision/FGVC/), [video](https://www.youtube.com/watch?v=CHHVPxHT7rc), [Colab](https://colab.research.google.com/drive/1pb6FjWdwq_q445rG2NP0dubw7LKNUkqc) | 30.12.2020
    • [GANSpace](https://github.com/harskish/ganspace) (Erik Härkönen): [GANDissect](https://github.com/CSAILVision/GANDissect), [NeurIPS](https://proceedings.neurips.cc/paper/2020/hash/6fe43269967adbb64ec6149852b5cc3e-Abstract.html), videos ([1](https://youtu.be/jdTICDa_eAI), [2](https://youtu.be/oIzwe_MOeQI)), [Colab](https://colab.research.google.com/github/harskish/ganspace/blob/master/notebooks/Ganspace_colab.ipynb) | 06.12.2020
    • [LaSAFT](https://github.com/ws-choi/Conditioned-Source-Separation-LaSAFT) (Woosung Choi): [DOI](https://doi.org/10.1109/ICASSP39728.2021.9413896), [arXiv](https://arxiv.org/abs/2010.11631), [data](https://sigsep.github.io/datasets/musdb.html), [project](https://lasaft.github.io/), [Colab](https://colab.research.google.com/github/ws-choi/Conditioned-Source-Separation-LaSAFT/blob/master/colab_demo/LaSAFT_with_GPoCM_Stella_Jang_Example.ipynb) | 01.11.2020
    • [faceswap-GAN](https://github.com/shaoanlu/faceswap-GAN) (shaoanlu): [Colab](https://colab.research.google.com/github/shaoanlu/faceswap-GAN/blob/master/colab_demo/faceswap-GAN_colab_demo.ipynb) | 12.09.2020
    • [InstColorization](https://github.com/ericsujw/InstColorization) (Jheng-Wei Su): [DOI](https://doi.org/10.1109/CVPR42600.2020.00799), [arXiv](https://arxiv.org/abs/2005.10825), [project](https://ericsujw.github.io/InstColorization/), [video](https://www.youtube.com/watch?v=Zj1N4uE1ehk), [Colab](https://colab.research.google.com/github/ericsujw/InstColorization/blob/master/InstColorization.ipynb) | 30.08.2020
    • SIREN (Vincent Sitzmann): [NeurIPS](https://proceedings.neurips.cc/paper/2020/hash/53c04118df112c13a8c34b38343b9c10-Abstract.html), [project](https://vsitzmann.github.io/siren/), [video](https://www.youtube.com/watch?v=Q2fLWGBeaiI), [Colab](https://colab.research.google.com/github/vsitzmann/siren/blob/master/explore_siren.ipynb) | 24.06.2020
    • [HRFAE](https://github.com/InterDigitalInc/HRFAE) (Xu Yao): related code ([caffemodel2pytorch](https://github.com/vadimkantorov/caffemodel2pytorch), [ffhq-dataset](https://github.com/NVlabs/ffhq-dataset)), [Colab](https://colab.research.google.com/github/InterDigitalInc/HRFAE/blob/master/test.ipynb) | 14.05.2020
    • Christine McLeavey
    • [3D Photo Inpainting](https://github.com/vt-vl-lab/3d-photo-inpainting) (Meng-Li Shih, [Johannes Kopf](https://johanneskopf.de/), [Jia-Bin Huang](https://jbhuang0604.github.io/)): [DOI](https://doi.org/10.1109/CVPR42600.2020.00805), [arXiv](https://arxiv.org/abs/2004.04727), [project](https://shihmengli.github.io/3D-Photo-Inpainting/), [Colab](https://colab.research.google.com/drive/1706ToQrkIZshRSJSHvZ1RuCiM__YX3Bz) | 04.05.2020
    • [Global-Flow-Local-Attention](https://github.com/RenYurui/Global-Flow-Local-Attention) (Yurui Ren, [Junming Chen](https://github.com/R-JunmingChen), [Thomas Li](https://ieeexplore.ieee.org/author/37086497292), [Ge Li](https://ieeexplore.ieee.org/author/37085815762)): arXiv ([2003.00696](https://arxiv.org/abs/2003.00696), [1605.03557](https://arxiv.org/abs/1605.03557)), [data](https://shapenet.org/), related code ([FaceForensics](https://github.com/ondyari/FaceForensics), [vid2vid](https://github.com/NVIDIA/vid2vid), [Pose-Transfer](https://github.com/tengteng95/Pose-Transfer)), [Medium](https://gt3rs.medium.com/compile-with-nvcc-3566fbdfdbf), [project](https://renyurui.github.io/GFLA-web/), [video](https://youtu.be/Ju0hBzCwsyU), [Colab](https://colab.research.google.com/github/RenYurui/Global-Flow-Local-Attention/blob/master/demo.ipynb) | 30.04.2020
    • Onsets and Frames transcription (Curtis Hawthorne): data ([MAESTRO](https://g.co/magenta/maestro-wave2midi2wave), [E-GMD](https://magenta.tensorflow.org/datasets/e-gmd)), [Colab](https://colab.research.google.com/notebooks/magenta/onsets_frames_transcription/onsets_frames_transcription.ipynb) | 02.04.2020
    • BERT fine-tuning with Cloud TPUs (Sourabh Bajaj): [DOI](https://doi.org/10.18653/v1/N19-1423), [TPU quickstart](https://cloud.google.com/tpu/docs/quickstart), [arXiv](https://arxiv.org/abs/1810.04805), [TensorFlow Hub](https://www.tensorflow.org/hub), [Colab](https://colab.research.google.com/github/tensorflow/tpu/blob/master/tools/colab/bert_finetuning_with_cloud_tpus.ipynb) | 29.03.2019
    • [StoryDiffusion](https://github.com/HVision-NKU/StoryDiffusion) (Yupeng Zhou, [Jiashi Feng](https://sites.google.com/site/jshfeng/?pli=1), [Qibin Hou](https://houqb.github.io/)): [arXiv](https://arxiv.org/abs/2405.01434), [project](https://storydiffusion.github.io/), [Reddit](https://www.reddit.com/r/StoryDiffusion/), videos ([1](https://youtu.be/jZWRENqCl6I), [2](https://youtu.be/GeNyP4VY9rE)), [Colab](https://colab.research.google.com/github/HVision-NKU/StoryDiffusion/blob/main/Comic_Generation.ipynb) | 04.05.2024
    • [FILM frame interpolation](https://github.com/google-research/frame-interpolation) (Fitsum Reda): [DOI](https://doi.org/10.1007/978-3-031-20071-7_15), [arXiv](https://arxiv.org/abs/2202.04901), data ([1](http://data.csail.mit.edu/tofu/testset/vimeo_interp_test.zip), [2](https://vision.middlebury.edu/flow/data), [3](https://people.cs.umass.edu/~hzjiang/projects/superslomo/UCF101_results.zip)), [softmax-splatting benchmark](https://github.com/sniklaus/softmax-splatting/blob/master/benchmark.py), [project](https://film-net.github.io/), TensorFlow docs ([TFRecord](https://www.tensorflow.org/tutorials/load_data/tfrecord), [tf.train.Example](https://www.tensorflow.org/api_docs/python/tf/train/Example), [SavedModel](https://www.tensorflow.org/guide/saved_model)), [video](https://youtu.be/OAD-BieIjH4), [Colab](https://colab.research.google.com/drive/1sK0uc-GJxmdnaxHhYqD2afRknakpdTNZ) | 03.05.2024
    • [Würstchen](https://github.com/dome272/wuerstchen) (Pablo Pernias, [Marc Aubreville](https://lme.tf.fau.de/person/aubreville/)): [arXiv](https://arxiv.org/abs/2306.00637), [Hugging Face blog](https://huggingface.co/blog/wuerstchen), [Reddit](https://www.reddit.com/r/StableDiffusion/comments/16hsklt/w%C3%BCrstchen_is_here_a_game_changing_fastest/), [video](https://youtu.be/ogJsCPqgFMk), [Colab](https://colab.research.google.com/github/dome272/Wuerstchen/blob/main/w%C3%BCrstchen-stage-C.ipynb) | 06.04.2024
    • [AvatarCLIP](https://github.com/hongfz16/AvatarCLIP) (Fangzhou Hong): [DOI](https://doi.org/10.1145/3528223.3530094), arXiv ([2205.08535](https://arxiv.org/abs/2205.08535), [2112.01455](https://arxiv.org/abs/2112.01455), [2112.03221](https://arxiv.org/abs/2112.03221), [2112.05139](https://arxiv.org/abs/2112.05139), [2203.13333](https://arxiv.org/abs/2203.13333)), [data](https://www.di.ens.fr/willow/research/surreal/data/), related code ([neural_renderer](https://github.com/daniilidis-group/neural_renderer), [MotionCLIP](https://github.com/GuyTevet/MotionCLIP), [NeuS](https://github.com/Totoro97/NeuS), [smplx](https://github.com/vchoutas/smplx), [human_body_prior](https://github.com/nghorbani/human_body_prior)), [project](https://hongfz16.github.io/projects/AvatarCLIP.html), [video](https://youtu.be/-l2ZMeoASGY), [Colab](https://colab.research.google.com/drive/1dfaecX7xF3nP6fyXc8XBljV5QY1lc1TR) | 15.05.2022
    • [Vision Transformer](https://github.com/google-research/vision_transformer) (Alexey Dosovitskiy): arXiv ([2010.11929](https://arxiv.org/abs/2010.11929), [2105.01601](https://arxiv.org/abs/2105.01601), [2106.10270](https://arxiv.org/abs/2106.10270), [2106.01548](https://arxiv.org/abs/2106.01548), [2111.07991](https://arxiv.org/abs/2111.07991), [2203.08065](https://arxiv.org/abs/2203.08065)), [blog post](https://blog.research.google/2022/04/locked-image-tuning-adding-language.html), related code ([pytorch-image-models](https://github.com/huggingface/pytorch-image-models), [flaxformer](https://github.com/google/flaxformer)), [Kaggle models](https://www.kaggle.com/models), [Medium](https://medium.com/@weiwen21/an-image-is-worth-16x16-words-transformers-for-image-recognition-at-scale-957f88e53726), videos ([1](https://youtu.be/TrdevFK_am4), [2](https://youtu.be/HZ4j_U3FC94), [3](https://youtu.be/7K4Z8RqjWIk), [4](https://youtu.be/oDtcobGQ7xU?si=C2EgZTESzhTXFSq6), [5](https://youtu.be/v6xj_DG-UEo)), [Colab](https://colab.research.google.com/github/google-research/vision_transformer/blob/main/vit_jax.ipynb) | 06.02.2024
    • [AudioLDM](https://github.com/haoheliu/AudioLDM) (Haohe Liu, [Xinhao Mei](https://xinhaomei.github.io/), [Xubo Liu](https://liuxubo717.github.io/), [Danilo Mandic](https://www.imperial.ac.uk/people/d.mandic), [Wenwu Wang](http://personal.ee.surrey.ac.uk/Personal/W.Wang/), [Mark Plumbley](https://www.surrey.ac.uk/people/mark-plumbley)): [arXiv](https://arxiv.org/abs/2301.12503), related code ([CLAP](https://github.com/LAION-AI/CLAP), [stable-diffusion](https://github.com/CompVis/stable-diffusion), [torch-fidelity](https://github.com/toshas/torch-fidelity)), [project](https://audioldm.github.io/), [video](https://youtu.be/_0VTltNYhao), [Colab](https://colab.research.google.com/github/olaviinha/NeuralTextToAudio/blob/main/AudioLDM_pub.ipynb) | 02.12.2023
    • [Qwen-VL](https://github.com/QwenLM/Qwen-VL) (Jinze Bai): arXiv ([2308.12966](https://arxiv.org/abs/2308.12966), [2106.09685](https://arxiv.org/abs/2106.09685), [2305.14314](https://arxiv.org/abs/2305.14314)), [demo](https://modelscope.cn/studios/qwen/Qwen-VL-Chat-Demo/summary), [Discord](https://discord.gg/z3GAxXZ9Ce), related code ([Awesome-Multimodal-Large-Language-Models](https://github.com/BradyFU/Awesome-Multimodal-Large-Language-Models/tree/Evaluation), [TouchStone](https://github.com/OFA-Sys/TouchStone), [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ)), Hugging Face ([SEED-Bench leaderboard](https://huggingface.co/spaces/AILab-CVC/SEED-Bench_Leaderboard), [Qwen/Qwen-VL](https://huggingface.co/Qwen/Qwen-VL)), videos ([1](https://youtu.be/ElrSJDg23Po), [2](https://youtu.be/E3MS8GfGWj4), [3](https://youtu.be/ju09YaO7BGA)), [Colab](https://colab.research.google.com/github/camenduru/Qwen-VL-Chat-colab/blob/main/Qwen_VL_Chat_colab.ipynb) | 24.11.2023
    • [Neuralangelo](https://github.com/NVlabs/neuralangelo) (Zhaoshuo Li, [Ming-Yu Liu](https://mingyuliu.net/), [Chen-Hsuan Lin](https://chenhsuanlin.bitbucket.io/)): [arXiv](https://arxiv.org/abs/2306.03092), [blog post](https://blogs.nvidia.com/blog/2023/06/01/neuralangelo-ai-research-3d-reconstruction/), [BlenderNeuralangelo](https://github.com/mli0603/BlenderNeuralangelo), [project](https://research.nvidia.com/labs/dir/neuralangelo/), videos ([1](https://youtu.be/PQMNCXR-WF8), [2](https://youtu.be/Qpdw3SW54kI), [3](https://youtu.be/lC2uPDfaTcE)), [Colab](https://colab.research.google.com/drive/13u8DX9BNzQwiyPPCB7_4DbSxiQ5-_nGF) | 27.08.2023
    • [MobileSAM](https://github.com/ChaoningZhang/MobileSAM) (Chaoning Zhang, [Seungkyu Lee](https://scholar.google.com/citations?user=3Pf6C6cAAAAJ), [Choong Seon Hong](https://scholar.google.com/citations?user=oKANWloAAAAJ)): [arXiv](https://arxiv.org/abs/2306.14289), related code ([joliGEN](https://github.com/jolibrain/joliGEN), [MobileSAM-in-the-Browser](https://github.com/akbartus/MobileSAM-in-the-Browser), [Inpaint-Anything](https://github.com/qiaoyu1002/Inpaint-Anything), [Personalize-SAM](https://github.com/qiaoyu1002/Personalize-SAM), [SegmentAnythingin3D](https://github.com/Jumpat/SegmentAnythingin3D), [anylabeling](https://github.com/vietanhdev/anylabeling), [SonarSAM](https://github.com/wangsssky/SonarSAM), [sd-webui-segment-anything](https://github.com/continue-revolution/sd-webui-segment-anything)), [Twitter](https://twitter.com/_akhaliq/status/1674410573075718145), [video](https://youtu.be/eTEfq_kWabQ), [Colab](https://colab.research.google.com/github/ChaoningZhang/MobileSAM/blob/master/notebooks/predictor_example.ipynb) | 30.06.2023
    • DeepCache (Xinyin Ma): Colab notebook `DeepCache_colab.ipynb` | 18.12.2023
    • DiffBIR (Xinqi Lin): [stable-diffusion-2-1-base](https://huggingface.co/stabilityai/stable-diffusion-2-1-base), [project](https://0x3f3f3f3fun.github.io/projects/diffbir/), videos ([1](https://youtu.be/rGnrpxWjBOg), [2](https://youtu.be/MIRiJGuGqsg)), [Colab](https://colab.research.google.com/github/camenduru/DiffBIR-colab/blob/main/DiffBIR_colab.ipynb) | 18.12.2023
    • [AnimeGANv3](https://github.com/TachibanaYoshino/AnimeGANv3) (Gang Liu): [DOI](http://doi.org/10.1587/transinf.2023EDP7061), [project](https://tachibanayoshino.github.io/AnimeGANv3/), videos ([1](https://youtu.be/EosubeJmAnE), [2](https://youtu.be/5qLUflWb45E), [3](https://youtu.be/iFjiaPlhVm4), [4](https://youtu.be/vJqQQMRYKh0), [5](https://youtu.be/0KaScDxgyBw), [6](https://youtu.be/6WXhjXb5a-o)), [Colab](https://colab.research.google.com/drive/1XYNWwM8Xq-U7KaTOqNap6A-Yq1f-V-FB) | 23.11.2023
    • [AudioSep](https://github.com/Audio-AGI/AudioSep) (Xubo Liu, [Yuzhuo Liu](https://github.com/redrabbit94), [Rui Xia](https://scholar.google.co.uk/citations?user=26oErxwAAAAJ), [Yuxuan Wang](https://scholar.google.com/citations?user=3RaOfJkAAAAJ), [Mark Plumbley](https://www.surrey.ac.uk/people/mark-plumbley), [Wenwu Wang](http://personal.ee.surrey.ac.uk/Personal/W.Wang/)): [arXiv](https://arxiv.org/abs/2308.05037), [project](https://audio-agi.github.io/Separate-Anything-You-Describe/), [Colab](https://colab.research.google.com/github/Audio-AGI/AudioSep/blob/main/AudioSep_Colab.ipynb) | 12.10.2023
    • [GHOST](https://github.com/ai-forever/ghost) (Alexander Groshev, [Denis Dimitrov](https://github.com/denndimitrov)): [DOI](https://doi.org/10.1109/ACCESS.2022.3196668), [blog post](https://habr.com/ru/company/sberbank/blog/645919/), [data](https://www.robots.ox.ac.uk/~vgg/data/vgg_face/), [Docker](https://hub.docker.com/r/wawa9000/ghost), [Colab](https://colab.research.google.com/drive/1vXTpsENipTmjTMggwveCkXASwxUk270n) | 22.08.2023
    • Xudong Wang
    • OWL-ViT (Matthias Minderer): [DOI](https://doi.org/10.1007/978-3-031-20080-9_42), [arXiv](https://arxiv.org/abs/2205.06230), [Hugging Face docs](https://huggingface.co/docs/transformers/model_doc/owlvit), [Colab](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/zeroshot_object_detection_with_owlvit.ipynb) | 21.08.2023
    • [Recognize Anything](https://github.com/xinyu1205/recognize-anything) (Xinyu Huang, [Yanchun Xie](https://scholar.google.com/citations?user=T0xk9-wAAAAJ), [Yuzhuo Qin](https://scholar.google.com/citations?user=5ZG65AkAAAAJ), [Tong Luo](https://ieeexplore.ieee.org/author/37089387319), [Yaqian Li](https://openreview.net/profile?id=~Yaqian_Li1), [Yandong Guo](http://www.lsl.zone/), [Lei Zhang](https://www.leizhang.org/)): arXiv ([2306.03514](https://arxiv.org/abs/2306.03514), [2303.05657](https://arxiv.org/abs/2303.05657)), related code ([Ask-Anything](https://github.com/OpenGVLab/Ask-Anything), [Prompt-Can-Anything](https://github.com/positive666/Prompt-Can-Anything)), [Medium](https://artgor.medium.com/paper-review-recognize-anything-a-strong-image-tagging-model-9e5e1c6dd0af), [project](https://recognize-anything.github.io/), [Colab](https://colab.research.google.com/github/mhd-medfa/recognize-anything/blob/main/recognize_anything_demo.ipynb) | 09.07.2023
    • [DragGAN](https://github.com/XingangPan/DragGAN) (Xingang Pan, [Lingjie Liu](https://lingjie0206.github.io/), [Abhimitra Meka](https://www.meka.page/), [Christian Theobalt](https://people.mpi-inf.mpg.de/~theobalt/)): [DOI](https://doi.org/10.1145/3588432.3591500), [arXiv](https://arxiv.org/abs/2305.10973), [stylegan3 requirements](https://github.com/NVlabs/stylegan3#requirements), [project](https://vcai.mpi-inf.mpg.de/projects/DragGAN/), [Twitter](https://twitter.com/XingangP), [Colab](https://colab.research.google.com/drive/1mey-IXPwQC_qSthI5hO-LTX7QL4ivtPh) | 03.07.2023
    • [UniFormer](https://github.com/Sense-X/UniFormer) (Kunchang Li): arXiv ([2201.04676](https://arxiv.org/abs/2201.04676), [2201.09450](https://arxiv.org/abs/2201.09450), [2104.10858](https://arxiv.org/abs/2104.10858), [2103.17239](https://arxiv.org/abs/2103.17239)), related code ([TokenLabeling](https://github.com/zihangJiang/TokenLabeling), [deit](https://github.com/facebookresearch/deit), [fvcore](https://github.com/facebookresearch/fvcore), [pytorch-image-models](https://github.com/rwightman/pytorch-image-models), [submitit](https://github.com/facebookincubator/submitit), [SlowFast](https://github.com/facebookresearch/SlowFast), [Swin-Transformer-Object-Detection](https://github.com/SwinTransformer/Swin-Transformer-Object-Detection), [PVT](https://github.com/whai362/PVT/tree/v2/segmentation), [HRFormer](https://github.com/HRNet/HRFormer/tree/main/pose)), Hugging Face demos ([image](https://huggingface.co/spaces/Sense-X/uniformer_image_demo), [video](https://huggingface.co/spaces/Sense-X/uniformer_video_demo), [detection](https://huggingface.co/spaces/Andy1621/uniformer_image_detection), [segmentation](https://huggingface.co/spaces/Andy1621/uniformer_image_segmentation)), [Colab](https://colab.research.google.com/github/open-mmlab/mmsegmentation/blob/master/demo/MMSegmentation_Tutorial.ipynb) | 31.03.2023
    • [PIFuHD](https://github.com/facebookresearch/pifuhd) (Shunsuke Saito, [Hanbyul Joo](https://jhugestar.github.io/)): [DOI](https://doi.org/10.1109/CVPR42600.2020.00016), [arXiv](https://arxiv.org/abs/2004.00452), videos ([1](https://youtu.be/uEDqCxvF5yc), [2](https://www.youtube.com/watch?v=8qnwbbDS8xk)), [Colab](https://colab.research.google.com/drive/11z58bl3meSzo6kFqkahMa35G5jmh2Wgt) | 26.03.2023
    • Tao Yang - pytorch)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yangxy/GPEN/blob/main/GPEN.ipynb) | 15.02.2023 |
    • Fabian-Robert Stöter - badge.php?doi=10.21105/joss.01667)](https://doi.org/10.21105/joss.01667) [![](https://img.shields.io/github/stars/sigsep/open-unmix-pytorch?style=social)](https://github.com/sigsep/open-unmix-pytorch) <ul><li>[data](https://sigsep.github.io/datasets/musdb.html#musdb18-compressed-stems)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/sigsep/norbert)</li><li>[<img src="images/paperswithcode.svg" alt="paperswithcode" height=20/>](https://paperswithcode.com/sota/music-source-separation-on-musdb18?p=open-unmix-a-reference-implementation-for)</li><li>[project](https://sigsep.github.io/open-unmix/)</li><li>[<img src="images/youtube.svg" alt="youtube" height=20/>](https://www.youtube.com/playlist?list=PLhA3b2k8R3t0VpYCpCTU2B1h604rvnV4N)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1mijF0zGWxN-KaxTnd0q6hayAlrID5fEQ) | 09.02.2023 |
    • Jon Gillick - datasets)</li><li>[web app](https://groove-drums.glitch.me/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=x2YLmXzovDo)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/magenta-demos/blob/master/colab-notebooks/GrooVAE.ipynb) | 01.02.2023 |
    • Chengyi Wang - chen.github.io/)</li> <li>[Yu Wu](https://www.microsoft.com/en-us/research/people/yuwu1/)</li> <li>[Ziqiang Zhang](https://github.com/zz12375)</li><details><summary>others</summary><li>[Long Zhou](https://long-zhou.github.io/)</li> <li>[Shujie Liu](https://www.microsoft.com/en-us/research/people/shujliu/)</li> <li>[Zhuo Chen](https://www.microsoft.com/en-us/research/people/zhuc/)</li> <li>[Yanqing Liu](https://scholar.google.com/citations?user=dIJFz4UAAAAJ)</li> <li>[Huaming Wang](https://scholar.google.com/citations?user=aJDLg5IAAAAJ)</li> <li>[Jinyu Li](https://www.microsoft.com/en-us/research/people/jinyli/)</li> <li>[Lei He](https://scholar.google.com/citations?user=EKl9yY8AAAAJ)</li> <li>[Sheng Zhao](https://scholar.google.com/citations?user=689bIIwAAAAJ)</li> <li>[Furu Wei](https://www.microsoft.com/en-us/research/people/fuwei/)</li></ul></details> | [![](https://img.shields.io/github/stars/enhuiz/vall-e?style=social)](https://github.com/enhuiz/vall-e) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.02111)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/encodec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeed#requirements)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://vidrihmarko.medium.com/mind-blowing-vall-e-neural-codec-language-models-are-zero-shot-text-to-speech-synthesizers-f002560ecd6)</li><li>[project](https://valle-demo.github.io/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/104ixvi/r_neural_codec_language_models_are_zeroshot_text/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/F6HSsVIkqIU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZehhrrQGmt4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-3MPZxRxvV4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ha2WjP7zfno)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1wEze0kQ0gt9B3bQmmbtbSXCoCTpq5vg-) | 18.01.2023 |
    • Hongwen Zhang - DensePose2SMPL), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/DensePose), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Microsoft/human-pose-estimation.pytorch)</li><li>[project](https://www.liuyebin.com/pymaf-x/)</li><li>[<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/yqEmznSKjYI), [<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/ylOB0wCeV34)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/11RXLsH9BdoSCwY6G-IX7KgqDxVoImu6K) | 06.10.2022 |
    • Alhussein Fawzi - Paredes](https://sites.google.com/site/romeraparedes/)</li> <li>[Mohammadamin Barekatain](http://barekatain.me/)</li> <li>[Alexander Novikov](https://scholar.google.com/citations?user=jMUkLqwAAAAJ)</li> <li>[Francisco Ruiz](https://franrruiz.github.io/)</li> <li>[Julian Schrittwieser](https://www.furidamu.org/)</li> <li>[Grzegorz Swirszcz](https://sites.google.com/site/grzegorzswirszcz/home)</li> <li>[David Silver](https://www.davidsilver.uk/)</li> <li>[Demis Hassabis](https://en.wikipedia.org/wiki/Demis_Hassabis)</li> <li>[Pushmeet Kohli](https://sites.google.com/site/pushmeet/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4) [![](https://img.shields.io/github/stars/deepmind/alphatensor?style=social)](https://github.com/deepmind/alphatensor) <ul><li>[blog post](https://www.deepmind.com/blog/discovering-novel-algorithms-with-alphatensor)</li><li>[<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/3N3Bl5AA5QU), [<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/gpYnDls4PdQ), [<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/IYgZS2EvnLI), [<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/8ILk4Wjo5rc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/alphatensor/blob/master/nonequivalence/inspect_factorizations_notebook.ipynb) | 04.10.2022 |
    • Emilien Dupont - nerf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/jaxline)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/celeb_a_hq)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/functa/blob/main/modulation_visualization_colab.ipynb) | 24.09.2022 |
    • Ji Lin - Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li></ul> | [![](https://img.shields.io/github/stars/mit-han-lab/anycost-gan?style=social)](https://github.com/mit-han-lab/anycost-gan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.03243)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/switchablenorms/CelebAMask-HQ), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/fyu/lsun)</li><li>[project](https://hanlab.mit.edu/projects/anycost-gan/)</li><li>[<img src="images/youtube.svg" alt="youtube" height=20/>](https://www.youtube.com/watch?v=_yEziPl9AkM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mit-han-lab/anycost-gan/blob/master/notebooks/intro_colab.ipynb) | 20.07.2022 |
    • Shuyang Gu - us/research/people/fangwen/)</li><details><summary>others</summary><li>[Bo Zhang](https://bo-zhang.me/)</li> <li>[Dongdong Chen](http://www.dongdongchen.bid/)</li> <li>[Lu Yuan](https://scholar.google.com/citations?&user=k9TsUVsAAAAJ)</li> <li>[Baining Guo](https://scholar.google.com/citations?user=h4kYmRYAAAAJ)</li> <li>[Shuyang Gu](https://github.com/cientgu)</li> <li>[Zhicong Tang](https://github.com/zzctan)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/VQ-Diffusion?style=social)](https://github.com/microsoft/VQ-Diffusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.14822), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.16007)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ehoogeboom/multinomial_diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/improved-diffusion)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Ws0_wK2cnsWEnfB7HtmPT4bjCPElb40C) | 30.06.2022 |
    • Xingyi Zhou - joulin/)</li> <li>[Philipp Krähenbühl](https://github.com/philkr)</li> <li>[Ishan Misra](https://imisra.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21) [![](https://img.shields.io/github/stars/facebookresearch/Detic?style=social)](https://github.com/facebookresearch/Detic) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.02605)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lvis-dataset/lvis-api)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1QtTW9-ukX2HKZGvt0QvVGqjuqEykoZKI) | 07.06.2022 |
    • Nathalie Pochet - gevaert)</li> <li>[Mohsen Nabian](https://github.com/monabiyan)</li> <li>[Jayendra Shinde](https://jayendrashinde91.github.io/)</li><details><summary>others</summary><li>[Celine Everaert](http://www.crig.ugent.be/en/node/510)</li> <li>[Thorin Tabor](http://thorin.tabcreations.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/gevaertlab/AMARETTO?style=social)](https://github.com/gevaertlab/AMARETTO) <ul><li>[bioconductor](https://bioconductor.org/packages/release/bioc/html/AMARETTO.html)</li><li>[project](http://portals.broadinstitute.org/pochetlab/amaretto.html)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JfnRoNgTVX_7VEGAAmjGjwP_yX2tdDxs) | 01.06.2022 |
    • Oscar Michel - On](https://github.com/roibaron)</li> <li>[Richard Liu](https://github.com/factoryofthesun)</li> <li>[Sagie Benaim](https://sagiebenaim.github.io/)</li> <li>[Rana Hanocka](http://people.cs.uchicago.edu/~ranahanocka/)</li></ul> | [![](https://img.shields.io/github/stars/threedle/text2mesh?style=social)](https://github.com/threedle/text2mesh) <ul><li>[CLIP](https://openai.com/blog/clip/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.03221)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/neverix/text2mesh/notebook)</li><li>[project](https://threedle.github.io/text2mesh/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/threedle/text2mesh/blob/master/colab_demo.ipynb) | 14.05.2022 |
    • Colin Raffel - research/text-to-text-transfer-transformer?style=social)](https://github.com/google-research/text-to-text-transfer-transformer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.10683)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tensorflow/mesh/tree/master/mesh_tensorflow/transformer)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/main/notebooks/t5-trivia.ipynb) | 11.05.2022 |
    • Arun Babu - 4IAAAAJ)</li> <li>[Alexei Baevski](https://github.com/alexeib)</li> <li>[Alexis Conneau](https://github.com/aconneau)</li> <li>[Michael Auli](https://github.com/michaelauli)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/fairseq?style=social)](https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/xlsr/README.md) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.09296)</li><li>[blog post](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairscale)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLS_R_on_Common_Voice.ipynb) | 10.05.2022 |
    • Hwanjun Song - heo/home)</li> <li>[Wonjae Kim](https://wonjae.kim/)</li> <li>[Ming-Hsuan Yang](http://faculty.ucmerced.edu/mhyang/)</li></ul></details> | [![](https://img.shields.io/github/stars/naver-ai/vidt?style=social)](https://github.com/naver-ai/vidt/tree/vidt-plus) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.07962), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.03921)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/fundamentalvision/Deformable-DETR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/EherSenaw/ViDT_colab)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/EherSenaw/ViDT_colab/blob/main/vidt_colab.ipynb) | 20.04.2022 |
    • Yinhuai Wang - hu/)</li> <li>[Jian Zhang](http://jianzhang.tech/)</li></ul> | [![](https://img.shields.io/github/stars/jianzhangcs/panini?style=social)](https://github.com/jianzhangcs/panini) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.08444)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/tkarras/progressive_growing_of_gans)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/GeeveGeorge/Panini-Net-Colab/blob/main/PaniniNet_Working.ipynb) | 13.04.2022 |
    • Fujun Luan - shechtman/)</li> <li>[Kavita Bala](https://www.cs.cornell.edu/~kb/)</li></ul> | [![](https://img.shields.io/github/stars/luanfujun/deep-painterly-harmonization?style=social)](https://github.com/luanfujun/deep-painterly-harmonization) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1804.03189), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1701.08893)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jcjohnson/neural-style), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/torch/torch7), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/szagoruyko/loadcaffe)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/gist/eyaler/5303782669fb43510d398bd346c6e3e6/deep-painterly-harmonization.ipynb) | 07.04.2022 |
    • Yael Vinker - bo.github.io/)</li> <li>[Roman Bachmann](https://roman-bachmann.github.io/)</li><details><summary>others</summary><li>[Amit Bermano](https://www.cs.tau.ac.il/~amberman/)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li> <li>[Amir Zamir](https://vilab.epfl.ch/zamir/)</li> <li>[Ariel Shamir](https://faculty.runi.ac.il/arik/site/index.asp)</li></ul></details> | [![](https://img.shields.io/github/stars/yael-vinker/CLIPasso?style=social)](https://github.com/yael-vinker/CLIPasso) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.05822), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.14843)</li><li>[demo](https://replicate.com/yael-vinker/clipasso)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/BachiLi/diffvg)</li><li>[project](https://clipasso.github.io/clipasso/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yael-vinker/CLIPasso/blob/main/CLIPasso.ipynb) | 21.03.2022 |
    • Patrick Esser - lab.com/people/ommer/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268) [![](https://img.shields.io/github/stars/CompVis/taming-transformers?style=social)](https://github.com/CompVis/taming-transformers) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841)</li><li>[project](https://compvis.github.io/taming-transformers/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/taming-transformers/blob/master/scripts/taming-transformers.ipynb) | 13.01.2022 |
    • Wonjong Jang - us/research/people/xtong/)</li> <li>[Seungyong Lee](https://scholar.google.com/citations?user=yGPH-nAAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/wonjongg/StyleCariGAN?style=social)](https://github.com/wonjongg/StyleCariGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.04331)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[project](https://wonjongg.github.io/StyleCariGAN/)</li><li>[<img src="images/youtube.svg" alt="youtube" height=20/>](https://www.youtube.com/watch?v=kpHbGOlI-BU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1HDRQGm7pvC9mAb6Lktoft_SmY9sCq_Qg) | 30.11.2021 |
    • Tobias Sunderdiek - badge.php?doi=10.1109/CVPR.2018.00986)](https://doi.org/10.1109/CVPR.2018.00986) <ul><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/alamson/safebooru)</li><li>[project](https://tobiassunderdiek.github.io/cartoon-gan/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/TobiasSunderdiek/cartoon-gan/blob/master/CartoonGAN.ipynb) | 24.11.2021 |
    • Xin Chen - badge.php?doi=10.1007/978-981-15-5577-0_18)](https://doi.org/10.1007/978-981-15-5577-0_18) [![](https://img.shields.io/github/stars/bryandlee/animegan2-pytorch?style=social)](https://github.com/bryandlee/animegan2-pytorch) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/TachibanaYoshino/AnimeGANv2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TachibanaYoshino/AnimeGAN)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/AnimeGANv2)</li><li>[project](https://tachibanayoshino.github.io/AnimeGANv2/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bryandlee/animegan2-pytorch/blob/master/colab_demo.ipynb) | 17.11.2021 |
    • Oran Lang - badge.php?doi=10.1109/ICCV48922.2021.00073)](https://doi.org/10.1109/ICCV48922.2021.00073) [![](https://img.shields.io/github/stars/google/explaining-in-style?style=social)](https://github.com/google/explaining-in-style) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.13369), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1906.10112), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.12799), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.04958), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.01711)</li><li>[blog post](https://ai.googleblog.com/2022/01/introducing-stylex-new-approach-for.html)</li><li>[project](https://explaining-in-style.github.io/)</li><li>[supplementary](https://explaining-in-style.github.io/supmat.html)</li><li>[<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/wLk2eBdXH4M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/explaining-in-style/blob/main/Explaining_in_Style_AttFind.ipynb) | 25.08.2021 |
    • Jaehyeon Kim - demo/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1CO61pZizDj7en71NQG_aqqKdGaA_SaBf) | 23.08.2021 |
    • Daniel Roich - Or](https://danielcohenor.com/)</li></ul> | [![](https://img.shields.io/github/stars/danielroich/PTI?style=social)](https://github.com/danielroich/PTI) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.05744)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/richzhang/PerceptualSimilarity)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/danielroich/PTI/blob/main/notebooks/inference_playground.ipynb) | 01.07.2021 |
    • Yu-Lun Liu - Sheng Lai](https://www.wslai.net/)</li> <li>[Ming-Hsuan Yang](https://faculty.ucmerced.edu/mhyang/)</li> <li>[Yung-Yu Chuang](https://www.csie.ntu.edu.tw/~cyy/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li></ul> | [![](https://img.shields.io/github/stars/alex04072000/NeRViS?style=social)](https://github.com/alex04072000/NeRViS) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.06205)</li><li>[data](http://liushuaicheng.org/SIGGRAPH2013/database.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cxjyxxme/deep-online-video-stabilization), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/jinsc37/DIFRINT)</li><li>[project](https://alex04072000.github.io/NeRViS/)</li><li>[<img src="images/youtube.svg" alt="youtube" height=20/>](https://youtu.be/KO3sULs4hso)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1l-fUzyM38KJMZyKMBWw_vu7ZUyDwgdYH) | 11.04.2021 |
    • Jong Wook Kim - 2021/Slides/9193.pdf)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openai/clip/blob/master/Interacting_with_CLIP.ipynb) | 29.01.2021 |
    • Tom Brown - lab/cleverhans/blob/master/examples/adversarial_patch/AdversarialPatch.ipynb) | 27.01.2021 |
    • Rinon Gal - chechik)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530164)](https://doi.org/10.1145/3528223.3530164) [![](https://img.shields.io/github/stars/rinongal/StyleGAN-nada?style=social)](https://github.com/rinongal/StyleGAN-nada) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.00946), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.17249), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.02699)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada)</li><li>[project](https://stylegan-nada.github.io/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/rinongal/stylegan-nada/blob/main/stylegan_nada.ipynb) | 09.08.2022 |
    • Konstantin Sofiiuk - inf.mpg.de/people/Petrov.html)</li> <li>[Olga Barinova](https://github.com/OlgaBarinova)</li> <li>[Anton Konushin](https://scholar.google.com/citations?user=ZT_k-wMAAAAJ)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00865)](https://doi.org/10.1109/CVPR42600.2020.00865) [![](https://img.shields.io/github/stars/SamsungLabs/fbrs_interactive_segmentation?style=social)](https://github.com/SamsungLabs/fbrs_interactive_segmentation) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2001.10331)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/HRNet/HRNet-Image-Classification)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ArcZ5xtyMCk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xg-5J9gLuXA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/SamsungLabs/fbrs_interactive_segmentation/blob/master/notebooks/colab_test_any_model.ipynb) | 25.01.2021 |
    • Prajwal Renukanand - badge.php?doi=10.1145/3394171.3413532)](https://doi.org/10.1145/3394171.3413532) [![](https://img.shields.io/github/stars/Rudrabha/Wav2Lip?style=social)](https://github.com/Rudrabha/Wav2Lip) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2008.10010)</li><li>[data](https://www.robots.ox.ac.uk/~vgg/data/lip_reading/lrs2.html)</li><li>[demo](http://bhaasha.iiit.ac.in/lipsync/)</li><li>[project](http://cvit.iiit.ac.in/research/projects/cvit-projects/a-lip-sync-expert-is-all-you-need-for-speech-to-lip-generation-in-the-wild/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=0fXaDCZNOJc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/eyaler/avatars4all/blob/master/melaflefon.ipynb) | 27.06.2024 |
    • Alexander Mathis - badge.php?doi=10.1038/s41593-018-0209-y)](https://doi.org/10.1038/s41593-018-0209-y) [![](https://img.shields.io/github/stars/DeepLabCut/DeepLabCut?style=social)](https://github.com/DeepLabCut/DeepLabCut) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1605.03170), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1804.03142), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.11229), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2009.00564), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.13868)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/deeplabcut/deeplabcut)</li><li>[forum](https://forum.image.sc/tag/deeplabcut)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/DeepLabCut/DLCutils), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DeepLabCut/DeepLabCut-Workshop-Materials)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@cziscience/how-open-source-software-contributors-are-accelerating-biomedicine-1a5f50f6846a)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/DeepLabCut)</li><li>[website](https://www.deeplabcut.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/@deeplabcut7702), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uWZu3rnj-kQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Teb5r2TNAYs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/DeepLabCut/DeepLabCut/blob/master/examples/COLAB/COLAB_maDLC_TrainNetwork_VideoAnalysis.ipynb) | 05.06.2024 |
    • Zinan Guo - ZHO-ZHO/ComfyUI-PuLID-ZHO), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Mikubill/sd-webui-controlnet/pull/2838)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/comfyui/comments/1cnv269/pulid_pure_and_lightning_id_customization_via/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/PuLID-jupyter/blob/main/PuLID_jupyter.ipynb) | 03.05.2024 |
    • Puyuan Peng - Yao Huang](https://berniebear.github.io/)</li> <li>[Shang-Wen Li](https://swdanielli.github.io/)</li> <li>[Abdelrahman Mohamed](https://www.cs.toronto.edu/~asamir/)</li> <li>[David Harwath](https://www.cs.utexas.edu/~harwath/)</li></ul> | [![](https://img.shields.io/github/stars/jasonppy/VoiceCraft?style=social)](https://github.com/jasonppy/VoiceCraft) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2403.16973)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lifeiteng/vall-e)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/pyp1/VoiceCraft)</li><li>[project](https://jasonppy.github.io/VoiceCraft_web/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/LocalLLaMA/comments/1bmxfk3/voicecraft_zeroshot_speech_editing_and/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/eikybOi8iwU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/PJ2qSjycLcw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/JxRrHpq-hys)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jasonppy/VoiceCraft/blob/master/voicecraft-gradio-colab.ipynb) | 21.04.2024 |
    • Ta-Ying Cheng - ZeroShot-MTrans)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/h94/IP-Adapter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/intel-isl/DPT/releases/download/1_0/dpt_hybrid-midas-501f0c75.pt)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://xthemadgenius.medium.com/zest-unlocks-material-magic-in-single-image-transfers-05f7ff7ee483)</li><li>[project](https://ttchengab.github.io/zest/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/learnmachinelearning/comments/1c0wpjd/zest_zeroshot_material_transfer_from_a_single/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/atG1VvgeG_g)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/zest-jupyter/blob/main/zest_jupyter.ipynb) | 16.04.2024 |
    • Jiale Xu - McAAAAJ)</li> <li>[Xintao Wang](https://xinntao.github.io/)</li><details><summary>others</summary><li>[Shenghua Gao](https://scholar.google.com/citations?user=fe-1v0MAAAAJ)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/TencentARC/InstantMesh?style=social)](https://github.com/TencentARC/InstantMesh) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2404.07191)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/danielgatis/rembg), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/3DTopia/OpenLRM), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nv-tlabs/FlexiCubes)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/TencentARC/InstantMesh)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1c5hs3e/instantmesh_efficient_3d_mesh_generation_from_a/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BvngSJOStvQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/InstantMesh-jupyter/blob/main/InstantMesh_jupyter.ipynb) | 16.04.2024 |
    • Roman Suvorov - %D1%81%D0%B8%D0%BB%D1%8C%D0%B2%D0%B5%D1%81%D1%82%D1%80%D0%BE%D0%B2-141b99b6/)</li> <li>[Naejin Kong](https://github.com/naejin-kong)</li> <li>[Harshith Goka](https://github.com/h9399-goka)</li> <li>[Kiwoong Park](https://github.com/kyoong-park)</li> <li>[Victor Lempitsky](http://sites.skoltech.ru/compvision/members/vilem/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/WACV51458.2022.00323)](https://doi.org/10.1109/WACV51458.2022.00323) [![](https://img.shields.io/github/stars/saic-mdal/lama?style=social)](https://github.com/saic-mdal/lama) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.07161)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/andy971022/auto-lama), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/richzhang/PerceptualSimilarity), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Po-Hsun-Su/pytorch-ssim), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mseitzer/pytorch-fid)</li><li>[project](https://saic-mdal.github.io/lama-project/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/saic-mdal/lama/blob/master/colab/LaMa_inpainting.ipynb) | 01.08.2023 |
    • Yue Ma - badge.php?doi=10.1609/aaai.v38i5.28206)](https://doi.org/10.1609/aaai.v38i5.28206) [![](https://img.shields.io/github/stars/mayuelala/FollowYourPose?style=social)](https://github.com/mayuelala/FollowYourPose) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.01186), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10752)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/bryandlee/Tune-A-Video), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmpose)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/YueMafighting/FollowYourPose_v1/tree/main), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/CompVis/stable-diffusion-v1-4)</li><li>[project](https://follow-your-pose.github.io/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://github.com/mayuelala)</li><li>[video](https://underline.io/lecture/91712-follow-your-pose-pose-guided-text-to-video-generation-using-pose-free-videos)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mayuelala/FollowYourPose/blob/main/quick_demo.ipynb) | 07.04.2023 |
    • Thomas Müller - evans)</li> <li>[Christoph Schied](https://research.nvidia.com/person/christoph-schied)</li> <li>[Alexander Keller](https://research.nvidia.com/person/alex-keller)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530127)](https://doi.org/10.1145/3528223.3530127) [![](https://img.shields.io/github/stars/NVlabs/instant-ngp?style=social)](https://github.com/NVlabs/instant-ngp) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.05989)</li><li>[blog post](https://developer.nvidia.com/blog/getting-started-with-nvidia-instant-nerfs/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/tiny-cuda-nn), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDLabMedia/large-lightfields-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nickponline/dd-nerf-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ocornut/imgui), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nothings/stb)</li><li>[project](https://nvlabs.github.io/instant-ngp/)</li><li>[tutorial](https://www.nvidia.com/en-us/on-demand/session/siggraph2022-sigg22-s-16/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/j8tMk-GE8hY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8GbENSmdVeE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/DJ2hcC1orc4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/z3-fjYzd0BA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/NVlabs/instant-ngp/blob/master/notebooks/instant_ngp.ipynb) | 18.01.2023 |
    • Marcos Conde - Jin Choi](https://github.com/Choiuijin1125)</li> <li>[Maxime Burchi](https://scholar.google.com/citations?user=7S_l2eAAAAAJ)</li> <li>[Radu Timofte](https://www.informatik.uni-wuerzburg.de/computervision/home/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-25063-7_42)](https://doi.org/10.1007/978-3-031-25063-7_42) [![](https://img.shields.io/github/stars/mv-lab/swin2sr?style=social)](https://github.com/mv-lab/swin2sr) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.11345), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.10257), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2208.11184), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.09883)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cszn/KAIR/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mv-lab/AISP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Swin-Transformer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/jjourney1125/swin2sr)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/jesucristo/super-resolution-demo-swin2sr-official/), [<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/datasets/jesucristo/super-resolution-benchmarks), [<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/jinssaa/official-swin2sr-demo-results/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1paPrt62ydwLv2U2eZqfcFsePI4X4WRR1) | 03.10.2022 |
    • Adam Botach - badge.php?doi=10.1109/CVPR52688.2022.00493)](https://doi.org/10.1109/CVPR52688.2022.00493) [![](https://img.shields.io/github/stars/mttr2021/MTTR?style=social)](https://github.com/mttr2021/MTTR) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.14821), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.11692), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.13230)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/SwinTransformer/Video-Swin-Transformer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/MTTR/MTTR-Referring-Video-Object-Segmentation)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YqlhXgq6hcs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12p0jpSx3pJNfZk-y_L44yeHZlhsKVra-) | 20.06.2022 |
    • Suman Ravuri - willson-6a1b422)</li> <li>[Dmitry Kangin](https://scholar.google.com/citations?user=vv-leaMAAAAJ)</li><details><summary>others</summary><li>[Rémi Lam](https://github.com/remilam)</li> <li>[Piotr Mirowski](https://piotrmirowski.com/)</li> <li>[Maria Athanassiadou](https://scholar.google.com/citations?user=VtkgHP0AAAAJ)</li> <li>[Sheleem Kashem](https://www.linkedin.com/in/sheleemkashem/)</li> <li>[Rachel Prudden](https://computerscience.exeter.ac.uk/staff/rep218)</li> <li>[Amol Mandhane](https://github.com/amol-mandhane)</li> <li>[Aidan Clark](https://scholar.google.com/citations?user=_19DrfIAAAAJ)</li> <li>[Andrew Brock](https://github.com/ajbrock)</li> <li>[Karen Simonyan](https://scholar.google.com/citations?user=L7lMQkQAAAAJ)</li> <li>[Raia Hadsell](https://github.com/raiah)</li> <li>[Niall Robinson](https://github.com/niallrobinson)</li> <li>[Ellen Clancy](https://www.linkedin.com/in/ellen-clancy-815967124)</li> <li>[Shakir Mohamed](https://www.shakirm.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03854-z)](https://doi.org/10.1038/s41586-021-03854-z) [![](https://img.shields.io/github/stars/deepmind/deepmind-research?style=social)](https://github.com/deepmind/deepmind-research/tree/master/nowcasting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.00954)</li><li>[blog post](https://deepmind.com/blog/article/nowcasting)</li><li>[local kernel](https://research.google.com/colaboratory/local-runtimes.html)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/hub)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/deepmind-research/blob/master/nowcasting/Open_sourced_dataset_and_model_snapshot_for_precipitation_nowcasting.ipynb) | 29.09.2021 |
    • Yuanxun Lu - badge.php?doi=10.1145/3478513.3480484)](https://doi.org/10.1145/3478513.3480484) [![](https://img.shields.io/github/stars/YuanxunLu/LiveSpeechPortraits?style=social)](https://github.com/YuanxunLu/LiveSpeechPortraits) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.10595)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lelechen63/ATVGnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lelechen63/Talking-head-Generation-with-Rhythmic-Head-Motion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DinoMan/speech-driven-animation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix)</li><li>[project](https://yuanxunlu.github.io/projects/LiveSpeechPortraits/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1tKvi-9kY3GkEK8lgtfTSM70rMFo_TY50) | 26.09.2021 |
    • Min Jin Chong - pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/znxlwm/UGATIT-pytorch)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VNg0NyCGl_4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mchong6/GANsNRoses/blob/master/inference_colab.ipynb) | 19.06.2021 |
    • Qianli Ma - Moll](https://virtualhumans.mpi-inf.mpg.de/)</li> <li>[Siyu Tang](https://scholar.google.com/citations?user=BUDh_4wAAAAJ)</li> <li>[Michael Black](https://ps.is.mpg.de/~black)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00650)](https://doi.org/10.1109/CVPR42600.2020.00650) [![](https://img.shields.io/github/stars/qianlim/CAPE?style=social)](https://github.com/qianlim/CAPE) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.13615), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.10267), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.02658)</li><li>[data](https://cape.is.tue.mpg.de/dataset)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/MPI-IS/mesh), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vchoutas/smplx), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/anuragranj/coma)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@mahyarfardinfar/learning-to-dress-3d-people-in-generative-clothing-486eb90136ff)</li><li>[project](https://cape.is.tue.mpg.de/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/e4W-hPFNwDE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NOEA-Rtq6vM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1DCNo2OyyTNi1xDG-7j32FZQ9sBA6i9Ys) | 05.08.2020 |
    • Angjoo Kanazawa - badge.php?doi=10.1109/CVPR.2018.00744)](https://doi.org/10.1109/CVPR.2018.00744) [![](https://img.shields.io/github/stars/akanazawa/hmr?style=social)](https://github.com/akanazawa/hmr) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1712.06584)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/dawars/hmr/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mattloper/chumpy), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CMU-Perceptual-Computing-Lab/openpose), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/MandyMo/pytorch_HMR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/layumi/hmr), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/russoale/hmr2.0)</li><li>[project](https://akanazawa.github.io/hmr/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bmMV9aJKa-c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Dene33/video_to_bvh/blob/master/video_to_bvh.ipynb) | 15.03.2019 |
    • Jesse Engel - fastgen)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=AaALLWQmCdI), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=BOoSy-Pg8is)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/notebooks/magenta/nsynth/nsynth.ipynb) | 06.04.2017 |
    • Doron Adler - colab-experiments/blob/master/WikiArt_Example_Generation_By_Peter_Baylies.ipynb) | 27.01.2020 |
    • Vladimir Iashin - iashin/SpecVQGAN?style=social)](https://github.com/v-iashin/SpecVQGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](http://arxiv.org/abs/2110.08791), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1711.00937), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2008.00820), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1712.01393), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1512.08512)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/PeihaoChen/regnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/toshas/torch-fidelity), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/descriptinc/melgan-neurips), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/lyra)</li><li>[project](https://iashin.ai/SpecVQGAN)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Foley_(filmmaking)), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Row-_and_column-major_order), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=Bucb3nAa398)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1pxTIMweAKApJZ3ZFqyBee3HtMqFpnwQ0) | 12.07.2024 |
    • Jianzhu Guo - lxq)</li> <li>[Zhizhou Zhong](https://scholar.google.com/citations?user=t88nyvsAAAAJ)</li><details><summary>others</summary><li>[Yuan Zhang](https://scholar.google.com/citations?user=_8k1ubAAAAAJ)</li> <li>[Pengfei Wan](https://scholar.google.com/citations?user=P6MraaYAAAAJ)</li> <li>[Di Zhang](https://openreview.net/profile?id=~Di_ZHANG3)</li></ul></details> | [![](https://img.shields.io/github/stars/KwaiVGI/LivePortrait?style=social)](https://github.com/KwaiVGI/LivePortrait) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2407.03168)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/kijai/ComfyUI-LivePortraitKJ), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/shadowcz007/comfyui-liveportrait), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhanglonghao1992/One-Shot_Free-View_Neural_Talking_Head_Synthesis), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/SPADE), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepinsight/insightface)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/KwaiVGI/LivePortrait)</li><li>[project](https://liveportrait.github.io/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1dvepjx/liveportrait_efficient_portrait_animation_with/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uyjSTAOY7yI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8-IcDDmiUMM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/aFcS31OWMjE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bRHf2oQwgG4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FPtpNrmuwXk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/wG7oPp01COg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/LivePortrait-jupyter/blob/main/LivePortrait_jupyter.ipynb) | 10.07.2024 |
    • Carl Doersch - 7_cAAAAJ)</li> <li>[Andrew Zisserman](https://www.robots.ox.ac.uk/~az/)</li></ul></details> | [![](https://img.shields.io/github/stars/google-deepmind/tapnet?style=social)](https://github.com/google-deepmind/tapnet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.08637), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.15975)</li><li>[blog post](https://deepmind-tapir.github.io/), [blog post](https://deepmind-tapir.github.io/blogpost.html)</li><li>[<img src="images/deepmind.svg" alt="deepmind" height=20/>](https://www.deepmind.com/open-source/kinetics)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/kubric/tree/main/challenges/point_tracking)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@jumabek4044/what-is-tapir-tracking-any-point-with-per-frame-initialization-and-temporal-refinement-and-how-it-bdad9946dc53)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper_files/paper/2022/hash/58168e8a92994655d6da3939e7cc0918-Abstract-Datasets_and_Benchmarks.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2HSHofqoJ9M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/I1DQJH3v7Nk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/tapnet/blob/master/colabs/causal_tapir_demo.ipynb) | 05.07.2024 |
    • Weihao Yu - badge.php?doi=10.1109/CVPR52688.2022.01055)](https://doi.org/10.1109/CVPR52688.2022.01055) [![](https://img.shields.io/github/stars/sail-sg/poolformer?style=social)](https://github.com/sail-sg/poolformer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.11418)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fvcore), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/apex)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/poolformer)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/sail-sg/poolformer/blob/main/misc/poolformer_demo.ipynb) | 01.06.2024 |
    • Chien-Yao Wang - Hau Yeh](https://ieeexplore.ieee.org/author/37088448531)</li> <li>[Hong-Yuan Mark Liao](https://homepage.iis.sinica.edu.tw/pages/liao/index_zh.html)</li></ul> | [![](https://img.shields.io/github/stars/WongKinYiu/yolov9?style=social)](https://github.com/WongKinYiu/yolov9) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2402.13616), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.16921)</li><li>[blog post](https://learnopencv.com/yolov9-advancing-the-yolo-legacy/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/WongKinYiu/yolor), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/VDIGPKU/DynamicDet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DingXiaoH/RepVGG)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/kadirnar/Yolov9), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/merve/yolov9)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@Mert.A/how-to-use-yolov9-for-object-detection-93598ad88d7d)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/XHT2c8jT3Bc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3iLJ6YWPg28), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dccf_sJF0Gg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/train-yolov9-object-detection-on-custom-dataset.ipynb) | 05.03.2024 |
    • Bernhard Kerbl - inf.mpg.de/~tleimkue/)</li> <li>[George Drettakis](http://www-sop.inria.fr/members/George.Drettakis/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433) [![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.04079)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/camenduru/gaussian-splatting)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/axinc-ai/3d-gaussian-splatting-real-time-rendering-of-photorealistic-scenes-f7f1a47f060)</li><li>[project](https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/singularity/comments/163jeqa/3d_gaussian_splatting_for_realtime_radiance_field/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/T_kXY43VZnk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/UXtuigy_wYc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HVv_IQKlafQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w43KV79LsFw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TLK3TDDcJFU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kShNYOuDnlI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/juRMRej2d5c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/gaussian-splatting-colab/blob/main/gaussian_splatting_colab.ipynb) | 19.12.2023 |
    • Marco Pasini - cnn), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CPJKU/madmom)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/marcop/musika)</li><li>[project](https://marcoppasini.github.io/musika)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QBl8y2Z_i7Y), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0l7OSM-bFvc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1PowSw3doBURwLE-OTCiWkO8HVbS5paRb) | 09.10.2023 |
    • Kaiheng Weng - object-detection/)</li><li>[data](https://cocodataset.org/#download)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://yolov6-docs.readthedocs.io/zh_CN/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/FeiGeChuanShu/ncnn-android-yolov6), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DefTruth/lite.ai.toolkit/blob/main/lite/ort/cv/yolov6.cpp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Linaom1214/TensorRT-For-YOLO-Series), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhiqwang/yolov5-rt-stack/tree/main/deployment/tensorrt-yolov6)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3OpwcGU7VvE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GJ0lVOE3a7c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3hqkbqJ5ag8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fFCWrMFH2UY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/meituan/YOLOv6/blob/master/turtorial.ipynb) | 08.10.2023 |
    • Jian Zhao - nb/Thin-Plate-Spline-Motion-Model?style=social)](https://github.com/yoyo-nb/Thin-Plate-Spline-Motion-Model) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.14367)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/monkey-net), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/video-preprocessing), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AliaksandrSiarohin/pose-evaluation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TalkUHulk/Image-Animation-Turbo-Boost)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/CVPR/Image-Animation-using-Thin-Plate-Spline-Motion-Model)</li><li>[supp](https://cloud.tsinghua.edu.cn/f/f7b8573bb5b04583949f/?dl=1)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1DREfdpnaBhqISg0fuQlAAIwyGVn1loH_) | 07.07.2023 |
    • Xu Zhao - ding)</li> <li>[Yongqi An](https://github.com/an-yongqi)</li> <li>[Yinglong Du](https://github.com/YinglongDu)</li><details><summary>others</summary><li>[Tao Yu](https://github.com/tianjinren)</li> <li>[Min Li](https://github.com/limin2021)</li> <li>[Ming Tang](https://www.researchgate.net/profile/Ming-Tang-2)</li> <li>[Jinqiao Wang](https://scholar.google.com/citations?user=7_BkyxEAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/CASIA-IVA-Lab/FastSAM?style=social)](https://github.com/CASIA-IVA-Lab/FastSAM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.12156), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10003)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ChuRuaNh0/FastSam_Awsome_TensorRT)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@mahimairaja/so-what-exactly-is-fastsam-the-ultimate-guide-ddae21d3b486)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yHNPyqazYYU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SslzS0AsiAw), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/qvqkjP1wCDE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1oX14f6IneGGw612WgVlAiy91UHwFAvr9) | 30.06.2023 |
    • Shilong Liu - cv.github.io/)</li> <li>[Chunyuan Li](https://scholar.google.com/citations?user=Zd7WmXUAAAAJ)</li> <li>[Jianwei Yang](https://jwyang.github.io/)</li> <li>[Hang Su](https://www.suhangss.me/)</li> <li>[Jun Zhu](https://scholar.google.com/citations?user=axsP38wAAAAJ)</li> <li>[Lei Zhang](https://www.leizhang.org/)</li></ul></details> | [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.05499)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/DINO), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Semantic-SAM), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/OptimalScale/DetGPT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/OpenSeeD), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Segment-Everything-Everywhere-All-At-Once), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/X-Decoder/tree/xgpt), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/detrex)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/zero-shot-object-detection-on-mscoco?p=grounding-dino-marrying-dino-with-grounded), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/zero-shot-object-detection-on-odinw?p=grounding-dino-marrying-dino-with-grounded), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/object-detection-on-coco-minival?p=grounding-dino-marrying-dino-with-grounded), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/object-detection-on-coco?p=grounding-dino-marrying-dino-with-grounded)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/wxWDt5UiwY8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/cMa77r3YrDk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/C4NqaRBz_Kw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oEQYStnF2l8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/zero-shot-object-detection-with-grounding-dino.ipynb) | 28.06.2023 |
    • Vineel Pratap - kundu)</li> <li>[Ali Elkahky](https://scholar.google.com/citations?user=KB3S8RoAAAAJ)</li> <li>[Zhaoheng Ni](https://scholar.google.com/citations?user=SYFMSNsAAAAJ)</li> <li>[Apoorv Vyas](https://apoorv2904.github.io/)</li> <li>[Maryam Fazel-Zarandi](https://www.maryamfazel.com/)</li> <li>[Alexei Baevski](https://github.com/alexeib)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li> <li>[Xiaohui Zhang](https://github.com/xiaohui-zhang)</li> <li>[Wei-Ning Hsu](https://wnhsu.github.io/)</li> <li>[Alexis Conneau](https://github.com/aconneau)</li> <li>[Michael Auli](https://github.com/michaelauli)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/fairseq?style=social)](https://github.com/facebookresearch/fairseq/tree/main/examples/mms) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.13516)</li><li>[blog post](https://ai.facebook.com/blog/multilingual-model-speech-recognition/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/model_doc/mms), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/mms-cclms/), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/blog/mms_adapters)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GEzxHxWys2s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g06agCmxS7I)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/fairseq/blob/main/examples/mms/asr/tutorial/MMS_ASR_Inference_Colab.ipynb) | 26.05.2023 |
    • Vage Egiazarian - directory), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T-Sample), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/Vahe1994/AQLM)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/LearningMachines/comments/1atvrnl/240106118_extreme_compression_of_large_language/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Qx8PNk4OkUA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hAHBKAXO-88)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Vahe1994/AQLM/blob/main/notebooks/colab_example.ipynb) | 08.03.2024 |
    • Zeming Lin - zhu-03a27424)</li><details><summary>others</summary><li>[Allan dos Santos Costa](https://scholar.google.com/citations?user=Zb4RsFsAAAAJ)</li> <li>[Maryam Fazel-Zarandi](https://www.maryamfazel.com/)</li> <li>[Tom Sercu](https://tom.sercu.me/)</li> <li>[Salvatore Candido](https://scholar.google.com/citations?user=BDgbhmEAAAAJ)</li> <li>[Alexander Rives](https://scholar.google.com/citations?user=vqb78-gAAAAJ)</li> <li>[Joshua Meier](https://scholar.google.com/citations?user=2M0OltAAAAAJ)</li> <li>[Robert Verkuil](https://dblp.org/pid/296/8930.html)</li> <li>[Jason Liu](https://www.linkedin.com/in/liujiayi/)</li> <li>[Chloe Hsu](https://chloe-hsu.com/)</li> <li>[Adam Lerer](https://scholar.google.com/citations?user=Ad6O4-0AAAAJ)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1101/622803)](https://doi.org/10.1101/622803) [![](https://img.shields.io/github/stars/facebookresearch/esm?style=social)](https://github.com/facebookresearch/esm) <ul><li>[ESM Atlas](https://esmatlas.com/)</li><li>[FSDP](https://fairscale.readthedocs.io/en/stable/api/nn/fsdp.html)</li><li>[ICML](https://proceedings.mlr.press/v139/rao21a.html)</li><li>[data](https://ftp.uniprot.org/pub/databases/uniprot/previous_releases/release-2018_03/uniref/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/sokrypton/ColabFold)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/esm)</li><li>[paper](https://doi.org/10.1101/2022.07.20.500902), [paper](https://doi.org/10.1101/2021.07.09.450648), [paper](https://doi.org/10.1101/2022.04.10.487779), [paper](https://doi.org/10.1101/2022.12.21.521521)</li><li>[pubmed](https://pubmed.ncbi.nlm.nih.gov/33876751/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/N-eisTvUYrk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GHoE4VkDehY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/sokrypton/ColabFold/blob/main/ESMFold.ipynb) | 28.12.2023 |
    • Tomoki Hayashi - badge.php?doi=10.1109/ICASSP40776.2020.9053795)](https://doi.org/10.1109/ICASSP40776.2020.9053795) [![](https://img.shields.io/github/stars/kan-bayashi/ParallelWaveGAN?style=social)](https://github.com/kan-bayashi/ParallelWaveGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.11480), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.06711), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05106)</li><li>[demo](https://kan-bayashi.github.io/ParallelWaveGAN/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/tacotron2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/espnet/espnet)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/espnet/notebook/blob/master/espnet2_tts_realtime_demo.ipynb) | 01.06.2023 |
    • Xuanhong Chen - badge.php?doi=10.1145/3394171.3413630)](https://doi.org/10.1145/3394171.3413630) [![](https://img.shields.io/github/stars/neuralchen/SimSwap?style=social)](https://github.com/neuralchen/SimSwap) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.06340)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepinsight/insightface)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/neuralchen/SimSwap/blob/master/SimSwap%20colab.ipynb) | 24.11.2021 |
    • Weihao Xia - Hao Xue](http://www.homepages.ucl.ac.uk/~ucakjxu/)</li> <li>[Baoyuan Wu](https://sites.google.com/site/baoyuanwu2015/home)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00229)](https://doi.org/10.1109/CVPR46437.2021.00229) [![](https://img.shields.io/github/stars/IIGROUP/TediGAN?style=social)](https://github.com/IIGROUP/TediGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.03308), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.08910)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/weihaox/Multi-Modal-CelebA-HQ), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/fyu/lsun)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L8Na2f5viAM)</li></ul> | [![Open In Colab](images/colab.svg)](http://colab.research.google.com/github/weihaox/TediGAN/blob/master/playground.ipynb) | 30.06.2021 |
    • Ming Ding - gEAAAAJ)</li> <li>[Wenyi Hong](https://github.com/wenyihong)</li> <li>[Wendi Zheng](https://github.com/minkowski0125)</li><details><summary>others</summary><li>[Chang Zhou](https://scholar.google.com/citations?user=QeSoG3sAAAAJ)</li> <li>[Junyang Lin](https://justinlin610.github.io/)</li> <li>[Xu Zou](http://xuzou.cn/)</li> <li>[Zhou Shao](https://www.researchgate.net/profile/Shao_Zhou4)</li> <li>[Hongxia Yang](https://sites.google.com/site/hystatistics/home)</li> <li>[Jie Tang](https://keg.cs.tsinghua.edu.cn/jietang/)</li></ul></details> | [![](https://img.shields.io/github/stars/THUDM/CogView?style=social)](https://github.com/THUDM/CogView) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.13290)</li><li>[demo](https://thudm.github.io/CogView/index.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/apex), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Sleepychord/cogdata)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/cogview-image-generation-and-language-modelling-at-scale-8d358a0686d2)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2021/hash/a4d92e2cd541fca87e4620aba658316d-Abstract.html)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/nmxsd8/r_cogview_mastering_texttoimage_generation_via/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Cw1r8ACIj8U)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Bi2TnSUp2vNiSUhamsNuC4HqkZ2J4WwZ) | 21.06.2021 |
    • Ceyuan Yang - badge.php?doi=10.1007/s11263-020-01429-5)](https://doi.org/10.1007/s11263-020-01429-5) [![](https://img.shields.io/github/stars/genforce/higan?style=social)](https://github.com/genforce/higan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.09267), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1412.6856), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1906.10112)</li><li>[project](https://genforce.github.io/higan/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=X5yWu2Jwjpg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/genforce/higan/blob/master/docs/HiGAN_Bedroom.ipynb) | 14.10.2020 |
    • Victor Dibia - badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11) [![](https://img.shields.io/github/stars/microsoft/lida?style=social)](https://github.com/microsoft/lida) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.02927)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/victordibia/llmx), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lida-project/lida-streamlit)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@c17hawke/lida-automatically-generate-visualization-and-with-llms-the-future-of-data-visualization-6bc556876b46)</li><li>[project](https://microsoft.github.io/lida/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/exYi9W-dhME), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U9K1Cu45nMQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/6xcCwlDx6f8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/lida/blob/main/notebooks/tutorial.ipynb) | 06.02.2024 |
    • Ziqiang Zhang - zhou.github.io/)</li> <li>[Chengyi Wang](https://cywang97.github.io/)</li> <li>[Sanyuan Chen](https://sanyuan-chen.github.io/)</li><details><summary>others</summary><li>[Yu Wu](https://www.microsoft.com/en-us/research/people/yuwu1/)</li> <li>[Shujie Liu](https://www.microsoft.com/en-us/research/people/shujliu/)</li> <li>[Zhuo Chen](https://www.microsoft.com/en-us/research/people/zhuc/)</li> <li>[Yanqing Liu](https://scholar.google.com/citations?user=dIJFz4UAAAAJ)</li> <li>[Huaming Wang](https://scholar.google.com/citations?user=aJDLg5IAAAAJ)</li> <li>[Jinyu Li](https://www.microsoft.com/en-us/research/people/jinyli/)</li> <li>[Lei He](https://scholar.google.com/citations?user=EKl9yY8AAAAJ)</li> <li>[Sheng Zhao](https://scholar.google.com/citations?user=689bIIwAAAAJ)</li> <li>[Furu Wei](https://www.microsoft.com/en-us/research/people/fuwei/)</li></ul></details> | [![](https://img.shields.io/github/stars/Plachtaa/VALL-E-X?style=social)](https://github.com/Plachtaa/VALL-E-X) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.03926), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.02111), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.03143)</li><li>[demo](https://plachtaa.github.io/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/qCBRmAnTxg)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lifeiteng/vall-e)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Plachta/VALL-E-X)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/speak-a-foreign-language-in-your-own-voice-1dafa42f78d9)</li><li>[project](https://www.microsoft.com/en-us/research/project/vall-e-x)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7qgfoVFQmvk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1yyD_sz531QntLKowMHo-XxorsFBCfKul) | 19.01.2024 |
    • David Junhao Zhang - wei-liu.github.io/)</li> <li>[Rui Zhao](https://ruizhaocv.github.io/)</li><details><summary>others</summary><li>[Lingmin Ran](https://siacorplab.nus.edu.sg/people/ran-lingmin/)</li> <li>[Yuchao Gu](https://ycgu.site/)</li> <li>[Difei Gao](https://scholar.google.com/citations?user=No9OsocAAAAJ)</li> <li>[Mike Zheng Shou](https://sites.google.com/view/showlab/home)</li></ul></details> | [![](https://img.shields.io/github/stars/showlab/Show-1?style=social)](https://github.com/showlab/Show-1) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.15818)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-base), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-interpolation), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-sr1), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/showlab/show-1-sr2), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/damo-vilab/modelscope-damo-text-to-video-synthesis), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/cerspense/zeroscope_v2_576w)</li><li>[project](https://showlab.github.io/Show-1/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/Show-1-colab/blob/main/Show_1_steps_colab.ipynb) | 15.10.2023 |
    • Wenxuan Zhang - xjtu.github.io/)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li> <li>[Fei Wang](http://gr.xjtu.edu.cn/zh/web/feynmanw)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836) [![](https://img.shields.io/github/stars/OpenTalker/SadTalker?style=social)](https://github.com/OpenTalker/SadTalker) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.12194)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/rrayYqZ4tf)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhanglonghao1992/One-Shot_Free-View_Neural_Talking_Head_Synthesis), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RenYurui/PIRender), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Deep3DFaceReconstruction), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Zz-ww/SadTalker-Video-Lip-Sync), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/OpenTalker/DPE), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/FeiiYin/SPI), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Mael-zys/T2M-GPT)</li><li>[project](https://sadtalker.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/AoIzJWnQw1M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fDgQcDL-qOc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BkSnM9cxkcM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7u0FYVPQ5rc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/OpenTalker/SadTalker/blob/main/quick_demo.ipynb) | 10.10.2023 |
    • Chenfei Wu - yin)</li> <li>[Weizhen Qi](https://github.com/WeizhenQ)</li> <li>[Xiaodong Wang](https://wang-xiaodong1899.github.io/)</li><details><summary>others</summary><li>[Zecheng Tang](https://github.com/CODINNLG)</li> <li>[Nan Duan](https://nanduan.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/visual-chatgpt?style=social)](https://github.com/microsoft/visual-chatgpt) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.04671)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/hwchase17/langchain), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lllyasviel/ControlNet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/timothybrooks/instruct-pix2pix), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/timojl/clipseg)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0UfXlFUwLms), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7YEiEyfPF5U)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/11BtP3h-w0dZjA-X8JsS9_eo8OeGYvxXB) | 15.03.2023 |
    • Lili Chen - grover.github.io/)</li> <li>[Michael Laskin](https://www.mishalaskin.com/)</li> <li>[Pieter Abbeel](http://people.eecs.berkeley.edu/~pabbeel/)</li> <li>[Aravind Srinivas](https://github.com/aravindsrinivas)</li> <li>[Igor Mordatch](https://scholar.google.com/citations?user=Vzr1RukAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/kzl/decision-transformer?style=social)](https://github.com/kzl/decision-transformer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.01345)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/models?other=gym-continous-control), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/edbeeching/decision-transformer-gym-hopper-expert), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/decision_transformer)</li><li>[project](https://sites.google.com/berkeley.edu/decision-transformer)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Autoregressive_model)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/k08N5a0gG0A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-buULmf7dec), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/83QN9S-0I84), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w4Bw8WYL8Ps)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1K3UuajwoPY1MzRKNkONNRS3gS5DxZ-qF) | 06.09.2022 |
    • Jianglin Fu - Yee Lin](https://kwanyeelin.github.io/)</li><details><summary>others</summary><li>[Chen Qian](https://scholar.google.com/citations?user=AerkT0YAAAAJ)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li> <li>[Wayne Wu](https://wywu.github.io/)</li> <li>[Ziwei Liu](https://liuziwei7.github.io/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-19787-1_1)](https://doi.org/10.1007/978-3-031-19787-1_1) [![](https://img.shields.io/github/stars/stylegan-human/stylegan-human?style=social)](https://github.com/stylegan-human/stylegan-human) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.11823)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan3)</li><li>[project](https://stylegan-human.github.io/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/market-1501)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nIrb9hwsdcI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/86b49sCz0Gg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g3nmM6MdxwY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/p2uwqh_SFL8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1sgxoDM55iM07FS54vz9ALg1XckiYA2On) | 19.08.2022 |
    • Hansheng Chen - QIwAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/tjiiv-cprg/EPro-PnP?style=social)](https://github.com/tjiiv-cprg/EPro-PnP) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13254)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/megvii-research/petr), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/HuangJunJie2017/BEVDet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/fudan-zvg/PolarFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zhiqi-li/BEVFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmdetection3d)</li><li>[nuScenes](https://www.nuscenes.org/object-detection?externalData=no&mapData=no&modalities=Camera)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TonBodQ6EUU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tjiiv-cprg/EPro-PnP/blob/main/demo/fit_identity.ipynb) | 12.07.2022 |
    • Susan Zhang - y6SIhQAAAAJ)</li> <li>[Xi Victoria Lin](http://victorialin.net/)</li> <li>[Todor Mihaylov](https://github.com/tbmihailov)</li> <li>[Myle Ott](https://myleott.com/)</li> <li>[Sam Shleifer](https://github.com/sshleifer)</li> <li>[Kurt Shuster](https://github.com/klshuster)</li> <li>[Daniel Simig](https://scholar.google.com/citations?user=TtWU9fsAAAAJ)</li> <li>[Punit Singh Koura](https://github.com/punitkoura)</li> <li>[Anjali Sridhar](https://www.linkedin.com/in/anjalisridhar/)</li> <li>[Tianlu Wang](https://tianlu-wang.github.io/)</li> <li>[Luke Zettlemoyer](https://www.cs.washington.edu/people/faculty/lsz/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/metaseq?style=social)](https://github.com/facebookresearch/metaseq/tree/main/projects/OPT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.01068), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1906.02243), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.10350), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.11990)</li><li>[blog post](https://ai.facebook.com/blog/democratizing-access-to-large-scale-language-models-with-opt-175b/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/Megatron-LM)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ejg0OunCi9U)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/14wnxMvD9zsiBQo2FtTpxn6w2cpXCcb-7) | 29.06.2022 |
    • Victor Sanh - Jian Jiang](https://github.com/tianjianjiang)</li> <li>[Matteo Manica](https://github.com/drugilsberg)</li> <li>[Sheng Shen](https://sincerass.github.io/)</li> <li>[Zheng Xin Yong](https://yongzx.github.io/)</li> <li>[Harshit Pandey](https://scholar.google.com/citations?user=BPIs78gAAAAJ)</li> <li>[Rachel Bawden](https://rbawden.github.io/)</li> <li>[Trishala Neeraj](https://github.com/trishalaneeraj)</li> <li>[Jos Rozen](https://scholar.google.com/citations?user=OxEDKogAAAAJ)</li> <li>[Abheesht Sharma](https://github.com/abheesht-sharma)</li> <li>[Andrea Santilli](https://teelinsan.github.io/)</li> <li>[Thibault Fevry](http://thibaultfevry.com/)</li> <li>[Jason Alan Fries](https://web.stanford.edu/~jfries/)</li> <li>[Ryan Teehan](https://github.com/rteehas)</li> <li>[Stella Biderman](https://www.stellabiderman.com/)</li> <li>[Leo Gao](https://github.com/leogao2)</li> <li>[Tali Bers](https://github.com/tbers-coursera)</li> <li>[Thomas Wolf](https://thomwolf.io/)</li> <li>[Alexander M. Rush](https://scholar.google.com/citations?user=LIjnUGgAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/bigscience-workshop/promptsource?style=social)](https://github.com/bigscience-workshop/promptsource) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.08207)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/iJ0IVZgGjTM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YToXXfrIu6w)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1xx7SgdLaAu23YFBirXmaQViDr8caowX_) | 29.05.2022 |
    • Bowen Cheng - schwing.de/)</li> <li>[Alexander Kirillov](https://alexander-kirillov.github.io/)</li> <li>[Rohit Girdhar](https://rohitgirdhar.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135) [![](https://img.shields.io/github/stars/facebookresearch/Mask2Former?style=social)](https://github.com/facebookresearch/Mask2Former) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.01527), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10764)</li><li>[demo](https://replicate.com/facebookresearch/mask2former)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/MaskFormer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/Mask2Former)</li><li>[project](https://bowenc0221.github.io/mask2former/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1uIWE5KbGFSjrxey2aRd5pWkKNY1_SaNq) | 09.02.2022 |
    • Badour AlBahar - lu/)</li> <li>[Jimei Yang](https://github.com/jimeiyang)</li> <li>[Zhixin Shu](https://zhixinshu.github.io/)</li><details><summary>others</summary><li>[Eli Shechtman](https://research.adobe.com/person/eli-shechtman/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/BadourAlBahar/pose-with-style?style=social)](https://github.com/BadourAlBahar/pose-with-style) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.06166)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[project](https://pose-with-style.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/d_ETeAVLilw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tg-bomze/collection-of-notebooks/blob/master/HomeStylist.ipynb) | 19.01.2022 |
    • Zhuang Liu - Yuan Wu](https://chaoyuan.org/)</li> <li>[Christoph Feichtenhofer](https://feichtenhofer.github.io/)</li><details><summary>others</summary><li>[Trevor Darrell](https://people.eecs.berkeley.edu/~trevor/)</li> <li>[Saining Xie](https://www.sainingxie.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167) [![](https://img.shields.io/github/stars/facebookresearch/ConvNeXt?style=social)](https://github.com/facebookresearch/ConvNeXt) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.03545)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/deit), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/unilm/tree/master/beit)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/convnext)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QzCjXqFnWPE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/idiIllIQOfU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QqejV0LNDHA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1CBYTIZ4tBMsVL5cqu9N_-Q3TBprqsfEO) | 19.01.2022 |
    • Alex Nichol - QMwAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/openai/glide-text2im?style=social)](https://github.com/openai/glide-text2im) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10741)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ItKi3h7IY2o)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openai/glide-text2im/blob/master/notebooks/inpaint.ipynb) | 22.12.2021 |
    • Yifu Zhang - fmh.github.io/)</li><details><summary>others</summary><li>[Ping Luo](http://luoping.me/)</li> <li>[Xinggang Wang](https://xinggangw.info/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1) [![](https://img.shields.io/github/stars/ifzhang/ByteTrack?style=social)](https://github.com/ifzhang/ByteTrack) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.06864)</li><li>[data](https://motchallenge.net/), [data](https://www.crowdhuman.org/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Megvii-BaseDetection/YOLOX), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ifzhang/FairMOT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PeizeSun/TransTrack), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/samylee/Towards-Realtime-MOT-Cpp)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/multi-object-tracking)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1bDilg4cmXFa8HCKHbsZ_p16p0vrhLyu0) | 30.10.2021 |
    • Elad Richardson - alaluf.github.io/)</li> <li>[Yotam Nitzan](https://yotamnitzan.github.io/)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00232)](https://doi.org/10.1109/CVPR46437.2021.00232) [![](https://img.shields.io/github/stars/eladrich/pixel2style2pixel?style=social)](https://github.com/eladrich/pixel2style2pixel) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2008.00951)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/HuangYG123/CurricularFace)</li><li>[project](https://eladrich.github.io/pixel2style2pixel/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bfvSwhqsTgM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/eladrich/pixel2style2pixel/blob/master/notebooks/inference_playground.ipynb) | 01.06.2021 |
    • Yang Song - Dickstein](http://www.sohldickstein.com/)</li> <li>[Diederik Kingma](http://dpkingma.com/)</li> <li>[Abhishek Kumar](https://abhishek.umiacs.io/)</li><details><summary>others</summary><li>[Stefano Ermon](https://cs.stanford.edu/~ermon/)</li> <li>[Ben Poole](https://cs.stanford.edu/~poole/)</li></ul></details> | [![](https://img.shields.io/github/stars/yang-song/score_sde?style=social)](https://github.com/yang-song/score_sde) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.13456), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.05600), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.09011), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.11239)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/yang-song/score_sde_pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/ml_collections)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L9ZegT87QK8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yang-song/score_sde/blob/main/Score_SDE_demo.ipynb) | 18.03.2021 |
    • Peng Zheng - Ping Fan](https://dengpingfan.github.io/)</li> <li>[Li Liu](https://scholar.google.com/citations?user=9cMQrVsAAAAJ)</li><details><summary>others</summary><li>[Jorma Laaksonen](https://scholar.google.com/citations?user=qQP6WXIAAAAJ)</li> <li>[Wanli Ouyang](https://wlouyang.github.io/)</li> <li>[Nicu Sebe](https://disi.unitn.it/~sebe/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.26599/AIR.2024.9150038)](https://doi.org/10.26599/AIR.2024.9150038) [![](https://img.shields.io/github/stars/ZhengPeng7/BiRefNet?style=social)](https://github.com/ZhengPeng7/BiRefNet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2401.03407), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2302.14485)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/d9NN5sgFrq)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Kazuhito00/BiRefNet-ONNX-Sample), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ZHO-ZHO-ZHO/ComfyUI-BiRefNet-ZHO), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/viperyl/ComfyUI-BiRefNet)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/ZhengPeng7/BiRefNet_demo), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ZhengPeng7/BiRefNet)</li><li>[project](https://www.birefnet.top/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/dichotomous-image-segmentation-on-dis-te1?p=bilateral-reference-for-high-resolution), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/camouflaged-object-segmentation-on-cod?p=bilateral-reference-for-high-resolution), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/rgb-salient-object-detection-on-davis-s?p=bilateral-reference-for-high-resolution)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1B6aKZ3ekcvKMkSBn0N5mCASLUYMp0whK) | 23.08.2024 |
    • Nikos Kolotouros - badge.php?doi=10.1109/ICCV.2019.00234)](https://doi.org/10.1109/ICCV.2019.00234) [![](https://img.shields.io/github/stars/nkolot/SPIN?style=social)](https://github.com/nkolot/SPIN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.12828)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/chaneyk/spin)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/vchoutas/smplify-x), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CMU-Perceptual-Computing-Lab/openpose)</li><li>[project](https://www.nikoskolot.com/projects/spin/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1uH2JtavOtDrFl6RsipyIncCSr19GWW4x) | 21.08.2024 |
    • Ao Wang - MIG/yolov10?style=social)](https://github.com/THU-MIG/yolov10) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2405.14458)</li><li>[blog post](https://learnopencv.com/yolov10/)</li><li>[demo](https://openbayes.com/console/public/tutorials/im29uYrnIoz)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rlggyp/YOLOv10-OpenVINO-CPP-Inference), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Seeed-Projects/jetson-examples/blob/main/reComputer/scripts/yolov10/README.md), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/kaylorchen/rk3588-yolo-demo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/yolov10-optimization/yolov10-optimization.ipynb), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/sujanshresstha/YOLOv10_DeepSORT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CVHub520/X-AnyLabeling), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/DanielSarmiento04/yolov10cpp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lyuwenyu/RT-DETR)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/collections/jameslahm/yolov10-665b0d90b0b5bb85129460c2), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/jameslahm/YOLOv10), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/kadirnar/Yolov10), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Xenova/yolov10-web)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@batuhansenerr/yolov10-custom-object-detection-bd7298ddbfd3), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@sunidhi.ashtekar/yolov10-revolutionizing-real-time-object-detection-72ef04ad441a)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/GPTFutureScience/comments/1d34rj1/yolov10_the_future_of_realtime_object_detection/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/29tnSxhB3CY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2ZFJbeJXXDM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/wM6nO75keOQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/train-yolov10-object-detection-on-custom-dataset.ipynb) | 20.08.2024 |
    • Xiaoyang Kang - Ouyang/)</li> <li>[Peiran Ren](https://scholar.google.com/citations?user=x5dEuxsAAAAJ)</li><details><summary>others</summary><li>[Lingzhi Li](https://lingzhili.com/)</li> <li>[Xuansong Xie](https://github.com/xungie)</li></ul></details> | [![](https://img.shields.io/github/stars/piddnad/DDColor?style=social)](https://github.com/piddnad/DDColor) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.11613)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jixiaozhong/ColorFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/KIMGEONUNG/BigColor)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/DDColor-colab/blob/main/DDColor_colab.ipynb) | 15.01.2024 |
    • Wenquan Lu - chaoyue.github.io/)</li> <li>[Dacheng Tao](https://scholar.google.com/citations?user=RwlJNLcAAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/wenquanlu/HandRefiner?style=social)](https://github.com/wenquanlu/HandRefiner) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2311.17957)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Fannovel16/comfyui_controlnet_aux), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Mikubill/sd-webui-controlnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/MeshGraphormer)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1881z4v/handrefiner_refining_malformed_hands_in_generated/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Tt-Fyn1RA6c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/HandRefiner-colab/blob/main/HandRefiner_colab.ipynb) | 08.01.2024 |
    • Evonne Ng - 3T3LaO3nlN6R8s6pPvVNAk5mdK) | 08.01.2024 |
    • Haotian Liu - li.github.io/)</li></ul> | [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.08485), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.03744), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.00890), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.09958), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.14895)</li><li>[demo](https://llava.hliu.cc/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/llama.cpp/pull/3436), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/LLaVA-Med), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lm-sys/FastChat), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Segment-Everything-Everywhere-All-At-Once), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Luodian/Otter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/liuhaotian/LLaVA-Pretrain), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/liuhaotian/LLaVA-Pretrained-Projectors)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://xthemadgenius.medium.com/how-to-use-llava-large-language-and-vision-assistant-732c666b5ed0)</li><li>[project](https://llava-vl.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/mkI7EPD1vp8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kx1VpI6JzsY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RxBSmbdJ1I8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/mdYycY4lsuE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/t7I46dxfmWs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/KRAQkJC-XJU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/LLaVA-colab/blob/main/LLaVA_13b_4bit_vanilla_colab.ipynb) | 22.12.2023 |
    • Zhongcong Xu - CYYAAAAJ)</li> <li>[Hanshu Yan](https://hanshuyan.github.io/)</li><details><summary>others</summary><li>[Jiawei Liu](https://jia-wei-liu.github.io/)</li> <li>[Chenxu Zhang](https://zhangchenxu528.github.io/)</li> <li>[Jiashi Feng](https://sites.google.com/site/jshfeng/home)</li> <li>[Mike Shou](https://sites.google.com/view/showlab)</li></ul></details> | [![](https://img.shields.io/github/stars/magic-research/magic-animate?style=social)](https://github.com/magic-research/magic-animate) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2311.16498)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/zcxu-eric/MagicAnimate), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/runwayml/stable-diffusion-v1-5), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/sd-vae-ft-mse)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@AIWorldBlog/revolutionizing-image-animation-with-magicanimate-technology-78cc94151915)</li><li>[project](https://showlab.github.io/magicanimate/)</li><li>[website](https://www.magicanimate.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/td27SyA9M80), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1pATjLFvNtY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HeXknItbMM8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/MagicAnimate-colab/blob/main/MagicAnimate_colab.ipynb) | 18.12.2023 |
    • Rohit Gandikota - sliders-lora-adaptors-for-precise-control-in-diffusion-models-b7f6b36fabee)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2020/hash/49856ed476ad01fcff881d57e161d73f-Abstract.html)</li><li>[project](https://sliders.baulab.info/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/180zon7/concept_sliders_lora_adaptors_for_precise_control/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/rohitgandikota/sliders/blob/main/demo_concept_sliders.ipynb) | 26.11.2023 |
    • Yang Zhou - shechtman/)</li> <li>[Jose Echevarria](http://www.jiechevarria.com/)</li><details><summary>others</summary><li>[Evangelos Kalogerakis](https://people.cs.umass.edu/~kalo/)</li> <li>[Dingzeyu Li](https://dingzeyu.li/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3414685.3417774)](https://doi.org/10.1145/3414685.3417774) [![](https://img.shields.io/github/stars/yzhou359/MakeItTalk?style=social)](https://github.com/yzhou359/MakeItTalk) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.12992)</li><li>[data](https://drive.google.com/drive/folders/1EwuAy3j1b9Zc1MsidUfxG_pJGc_cV60O)</li><li>[project](https://people.umass.edu/~yangzhou/MakeItTalk/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=vUMGKASgbf8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/iboyles/makeittalknow/blob/main/working_quick_demo_of_makeittalk_07_2023.ipynb) | 27.07.2023 |
    • chervonij - badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628) [![](https://img.shields.io/github/stars/iperov/DeepFaceLab?style=social)](https://github.com/iperov/DeepFaceLab) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05535)</li><li>[guide](https://mrdeepfakes.com/forums/thread-guide-deepfacelab-google-colab-tutorial)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCTKBl8kB6DJ_qLnk1NGDGbQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/chervonij/DFL-Colab/blob/master/DFL_Colab.ipynb) | 30.04.2023 |
    • Levon Khachatryan - hen)</li><details><summary>others</summary><li>[Zhangyang Wang](https://www.ece.utexas.edu/people/faculty/atlas-wang)</li> <li>[Shant Navasardyan](https://scholar.google.com/citations?user=VJSh59sAAAAJ)</li> <li>[Humphrey Shi](https://www.humphreyshi.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/Picsart-AI-Research/Text2Video-Zero?style=social)](https://github.com/Picsart-AI-Research/Text2Video-Zero) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.13439), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.01341), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.17604)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/dbolya/tomesd), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/JiauZhang/Text2Video-Zero), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/camenduru/text2video-zero-colab), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/SHI-Labs/Text2Video-Zero-sd-webui)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/api/pipelines/text_to_video_zero)</li><li>[project](https://text2video-zero.github.io/)</li><li>[video](https://www.dropbox.com/s/uv90mi2z598olsq/Text2Video-Zero.MP4)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/beeDJJz-Q0A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/97-1GYPtz0M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/text2video-zero-colab/blob/main/text2video_all.ipynb) | 11.04.2023 |
    • Alec Radford - python)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/OCBZtgQGt1I), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8SQV-B83tPU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nE5iVtwKerA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openai/whisper/blob/master/notebooks/LibriSpeech.ipynb) | 21.09.2022 |
    • Jingxiang Sun - 3D?style=social)](https://github.com/MrTornado24/IDE-3D) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.15517)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/eg3d), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan3)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Kj5XY_J2Alk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/MrTornado24/IDE-3D/blob/main/inversion/notebooks/inference_playground.ipynb) | 08.09.2022 |
    • Ajay Jain - badge.php?doi=10.1109/CVPR52688.2022.00094)](https://doi.org/10.1109/CVPR52688.2022.00094) [![](https://img.shields.io/github/stars/google-research/google-research?style=social)](https://github.com/google-research/google-research/tree/master/dreamfields) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.01455), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.00677), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.13415)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ajayjain/DietNeRF), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/mipnerf)</li><li>[project](https://ajayj.com/dreamfields)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1Fke6w46tv4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1TjCWS2_Q0HJKdi9wA2OSY7avmFUQYGje) | 05.09.2022 |
    • Oran Gafni - A-Scene?style=social)](https://github.com/CasualGANPapers/Make-A-Scene) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13131)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZM06MjPdoxw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1SPyQ-epTsAOAu8BEohUokN4-b5RM_TnE) | 12.08.2022 |
    • Liangyu Chen - cfoAAAAJ)</li> <li>[Jian Sun](http://www.jiansun.org/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2) [![](https://img.shields.io/github/stars/megvii-research/NAFNet?style=social)](https://github.com/megvii-research/NAFNet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.04676), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.08714)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/image-deblurring-on-gopro?p=simple-baselines-for-image-restoration), [<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/image-denoising-on-sidd?p=simple-baselines-for-image-restoration)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dkO5AyktmBoWwxBwoKFUurIDn0m4qDXT) | 15.04.2022 |
    • Keunhong Park - Brualla](https://ricardomartinbrualla.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00581)](https://doi.org/10.1109/ICCV48922.2021.00581) [![](https://img.shields.io/github/stars/google/nerfies?style=social)](https://github.com/google/nerfies) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.12948)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/google-research/tree/master/jaxnerf)</li><li>[project](https://nerfies.github.io/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/photogrammetry/comments/k1i0ct/deformable_neural_radiance_fields_nerfies/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MrKrnHhk8IA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/IDMiMKWucaI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/nerfies/blob/main/notebooks/Nerfies_Capture_Processing.ipynb) | 06.12.2021 |
    • Omer Tov - alaluf.github.io/)</li> <li>[Yotam Nitzan](https://yotamnitzan.github.io/)</li> <li>[Or Patashnik](https://orpatashnik.github.io/)</li> <li>[Daniel Cohen-Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450626.3459838)](https://doi.org/10.1145/3450626.3459838) [![](https://img.shields.io/github/stars/omertov/encoder4editing?style=social)](https://github.com/omertov/encoder4editing) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.02766)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/eladrich/pixel2style2pixel)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/omertov/encoder4editing/blob/master/notebooks/inference_playground.ipynb) | 02.12.2021 |
    • Asher Trockman - cifar10), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/codex/an-overview-on-convmixer-patches-are-all-you-need-8502a8d87011)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Gl0s0GDqN3c?t=990)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/locuslab/convmixer/blob/main/pytorch-image-models/notebooks/EffResNetComparison.ipynb) | 05.10.2021 |
    • Pramook Khungurn - head-anime-demo?style=social)](https://github.com/pkhungurn/talking-head-anime-demo) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lincolnhard/head-pose-estimation)</li><li>[project](https://pkhungurn.github.io/talking-head-anime/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Virtual_YouTuber), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/MikuMikuDance)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kMQCERkTdO0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/T1Gp-RxFZwU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FioRJ6x_RbI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/pkhungurn/talking-head-anime-demo/blob/master/tha_colab.ipynb) | 23.02.2021 |
    • Hang Zhang - badge.php?doi=10.1007/978-3-030-11018-5_32)](https://doi.org/10.1007/978-3-030-11018-5_32) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.06953)</li><li>[project](http://computervisionrutgers.github.io/MSG-Net/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=oy6pWNWBt4Y)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/zhanghang1989/PyTorch-Multi-Style-Transfer/blob/master/msgnet.ipynb) | 25.01.2021 |
    • Tianyi Zhang
    • Noah Hollmann - freiburg.de/profile/hutter/)</li></ul> | [![](https://img.shields.io/github/stars/automl/TabPFN?style=social)](https://github.com/automl/TabPFN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2207.01848), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.11189), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.01342), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.03253), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10510)</li><li>[blog post](https://www.automl.org/tabpfn-a-transformer-that-solves-small-tabular-classification-problems-in-a-second/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/tunguz/status/1578730907711655937)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BGTO5N5-ack)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/194mCs6SEPEW6C0rcP7xWzcEtt1RBc8jJ) | 29.11.2023 |
    • Ziwei Luo - uir?style=social)](https://github.com/Algolzw/daclip-uir) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.01018)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Algolzw/image-restoration-sde)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/weblzw/daclip-uir-ViT-B-32-irsde)</li><li>[project](https://algolzw.github.io/daclip-uir/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/daclip-uir-colab/blob/main/daclip_uir_gradio_colab.ipynb) | 11.10.2023 |
    • Denis Korzhenkov - badge.php?doi=10.1109/CVPR42600.2020.00751)](https://doi.org/10.1109/CVPR42600.2020.00751) [![](https://img.shields.io/github/stars/saic-mdal/HiDT?style=social)](https://github.com/saic-mdal/HiDT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.08791)</li><li>[project](https://saic-mdal.github.io/HiDT/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLuvGzlEQXT1KQuKrfBBEWh2f3PToxyeM5), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=EWKAgwgqXB4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/saic-mdal/hidt/blob/master/notebooks/HighResolutionDaytimeTranslation.ipynb) | 24.07.2023 |
    • Alexander Kirillov - rolland-223135a/)</li> <li>[Laura Gustafson](https://scholar.google.com/citations?user=c8IpF9gAAAAJ)</li> <li>[Tete Xiao](https://tetexiao.com/)</li> <li>[Spencer Whitehead](https://www.spencerwhitehead.com/)</li> <li>[Alex Berg](http://acberg.com/)</li> <li>[Wan-Yen Lo](https://github.com/wanyenlo)</li> <li>[Piotr Dollar](https://pdollar.github.io/)</li> <li>[Ross Girshick](https://www.rossgirshick.info/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/segment-anything?style=social)](https://github.com/facebookresearch/segment-anything) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.02643)</li><li>[blog post](https://ai.facebook.com/research/publications/segment-anything/), [blog post](https://ai.facebook.com/blog/segment-anything-foundation-model-image-segmentation/)</li><li>[data](https://ai.facebook.com/datasets/segment-anything/)</li><li>[website](https://segment-anything.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2O_vecl28OA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fVeW9a6wItM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FjYE0tKWOiY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/segment-anything/blob/main/notebooks/predictor_example.ipynb) | 10.04.2023 |
    • Kun Cheng - badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399) [![](https://img.shields.io/github/stars/OpenTalker/video-retalking?style=social)](https://github.com/OpenTalker/video-retalking) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.14758)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/donydchen/ganimation_replicate), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RenYurui/PIRender), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/OpenTalker/StyleHEAT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/FeiiYin/SPI), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Mael-zys/T2M-GPT)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://xthemadgenius.medium.com/making-videos-talk-right-syncing-lips-with-sound-using-videoretalking-611428084bbc)</li><li>[project](https://opentalker.github.io/video-retalking/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/178krha/videoretalking/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/pttsTrQ-fko), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2Lkw8AmmRn0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RJ8YK_K4Ne0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/vinthony/video-retalking/blob/main/quick_demo.ipynb) | 19.03.2023 |
    • Jay Zhangjie Wu - F69UAAAAJ)</li> <li>[Mike Zheng Shou](https://sites.google.com/view/showlab)</li></ul></details> | [![](https://img.shields.io/github/stars/showlab/Tune-A-Video?style=social)](https://github.com/showlab/Tune-A-Video) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.11565), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10752)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Tune-A-Video-library), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai/stable-diffusion-2-1), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/sd-dreambooth-library)</li><li>[project](https://tuneavideo.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uzF6CTtjn-g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uUlp1_ExsGQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/showlab/Tune-A-Video/blob/main/notebooks/Tune-A-Video.ipynb) | 23.02.2023 |
    • Max Ingham - diffusion?style=social)](https://github.com/alembics/disco-diffusion) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/guided-diffusion)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_DtWfh9oS54), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gWxmtdZL8FE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yVJB6oD0_gM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/alembics/disco-diffusion/blob/main/Disco_Diffusion.ipynb) | 11.02.2023 |
    • Matthew Tancik - Keil](https://people.eecs.berkeley.edu/~sfk/)</li><details><summary>others</summary><li>[Nithin Raghavan](https://cseweb.ucsd.edu//~n2raghavan/)</li> <li>[Utkarsh Singhal](https://scholar.google.com/citations?user=lvA86MYAAAAJ)</li> <li>[Ravi Ramamoorthi](https://cseweb.ucsd.edu//~ravir/)</li> <li>[Jon Barron](https://jonbarron.info/)</li> <li>[Ren Ng](https://www2.eecs.berkeley.edu/Faculty/Homepages/yirenng.html)</li></ul></details> | [![](https://img.shields.io/github/stars/tancik/fourier-feature-networks?style=social)](https://github.com/tancik/fourier-feature-networks) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1806.07572)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2020/hash/55053683268957697aa39fba6f231c68-Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2007/hash/013a006f03dbc5392effeb8f18fda755-Abstract.html)</li><li>[project](https://bmild.github.io/fourfeat/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nVA6K6Sn2S4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tancik/fourier-feature-networks/blob/master/Demo.ipynb) | 17.01.2023 |
    • Jiefeng Li - sjtu/HybrIK?style=social)](https://github.com/Jeff-sjtu/HybrIK) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.14672)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mks0601/3DMPPE_POSENET_RELEASE)</li><li>[project](https://jeffli.site/HybrIK/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/3d-human-pose-estimation-on-3dpw?p=hybrik-a-hybrid-analytical-neural-inverse)</li><li>[supp](https://openaccess.thecvf.com/content/CVPR2021/supplemental/Li_HybrIK_A_Hybrid_CVPR_2021_supplemental.zip)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tvwnXXH7xIw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1n41l7I2NxWseuruVQEU8he2XqzSXhu2f) | 01.01.2023 |
    • Mingyuan Zhang - zhang/MotionDiffuse?style=social)](https://github.com/mingyuan-zhang/MotionDiffuse) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2208.15001)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/mingyuan/MotionDiffuse)</li><li>[project](https://mingyuan-zhang.github.io/projects/MotionDiffuse.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U5PTnw490SA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Dp6VsZp2ozKuu9ccMmsDjyij_vXfCYb3) | 13.10.2022 |
    • William Peebles - Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li> <li>[Richard Zhang](https://richzhang.github.io/)</li> <li>[Antonio Torralba](https://groups.csail.mit.edu/vision/torralbalab/)</li><details><summary>others</summary><li>[Alexei Efros](https://people.eecs.berkeley.edu/~efros/)</li> <li>[Eli Shechtman](https://research.adobe.com/person/eli-shechtman/)</li></ul></details> | [![](https://img.shields.io/github/stars/wpeebles/gangealing?style=social)](https://github.com/wpeebles/gangealing) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.05143)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/nileshkulkarni/acsm), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://jitengmu.github.io/CoordGAN/)</li><li>[project](https://www.wpeebles.com/gangealing)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Qa1ASS_NuzE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qtOkktTNs-k)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JkUjhTjR8MyLxwarJjqnh836BICfocTu) | 01.09.2022 |
    • Liunian Harold Li - zhang.github.io/)</li> <li>[Jianwei Yang](https://jwyang.github.io/)</li><details><summary>others</summary><li>[Chunyuan Li](https://chunyuan.li/)</li> <li>[Yiwu Zhong](https://pages.cs.wisc.edu/~yiwuzhong/)</li> <li>[Lijuan Wang](https://github.com/LijuanWang)</li> <li>[Lu Yuan](https://scholar.google.com/citations?user=k9TsUVsAAAAJ)</li> <li>[Lei Zhang](https://www.leizhang.org/)</li> <li>[Jenq-Neng Hwang](https://people.ece.uw.edu/hwang/)</li> <li>[Kai-Wei Chang](http://web.cs.ucla.edu/~kwchang/)</li> <li>[Jianfeng Gao](https://www.microsoft.com/en-us/research/people/jfgao/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069) [![](https://img.shields.io/github/stars/microsoft/GLIP?style=social)](https://github.com/microsoft/GLIP) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.03857), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.05836), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.01066), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.08790)</li><li>[blog post](https://www.microsoft.com/en-us/research/project/project-florence-vl/articles/object-detection-in-the-wild-via-grounded-language-image-pre-training/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/gligen/GLIGEN)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/harold/GLIP)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://sh-tsang.medium.com/glip-grounded-language-image-pre-training-2be2483295b3), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/glip-introducing-language-image-pre-training-to-object-detection-5ddb601873aa)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zu1BGQBI4dU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12x7v-_miN7-SRiziK3Cx4ffJzstBJNqb) | 30.07.2022 |
    • Rohit Girdhar - joulin/)</li> <li>[Ishan Misra](https://imisra.github.io/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01563)](https://doi.org/10.1109/CVPR52688.2022.01563) [![](https://img.shields.io/github/stars/facebookresearch/omnivore?style=social)](https://github.com/facebookresearch/omnivore) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.08377), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.08356)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/akhaliq/omnivore)</li><li>[project](https://facebookresearch.github.io/omnivore/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/epic-kitchens-100)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/omnivore/blob/main/inference_tutorial.ipynb) | 14.06.2022 |
    • Sen He - hannover.de/en/staff/liao/)</li> <li>[Michael Yang](https://sites.google.com/site/michaelyingyang/)</li> <li>[Yi-Zhe Song](http://personal.ee.surrey.ac.uk/Personal/Y.Song/)</li><details><summary>others</summary><li>[Bodo Rosenhahn](https://scholar.google.com/citations?user=qq3TxtcAAAAJ)</li> <li>[Tao Xiang](http://personal.ee.surrey.ac.uk/Personal/T.Xiang/index.html)</li></ul></details> | [![](https://img.shields.io/github/stars/SenHe/DLFS?style=social)](https://github.com/SenHe/DLFS) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.02874)</li><li>[project](https://senhe.github.io/projects/iccv_2021_lifespan_face/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=uklX03ns0m0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1fgVAoxCSaqPkj0rUK4RmBh7GTQRqLNpE) | 22.02.2022 |
    • Felix Petersen - konstanz.de/personen/prof-dr-oliver-deussen/)</li></ul> | [![](https://img.shields.io/github/stars/Felix-Petersen/diffsort?style=social)](https://github.com/Felix-Petersen/diffsort) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.04019), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.09630)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Rl-sFaE1z4M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1q0TZFFYB9FlOJYWKt0_7ZaXQT190anhm) | 17.01.2022 |
    • Chrisantha Fernando - Baptiste Alayrac](https://www.jbalayrac.com/)</li> <li>[Piotr Mirowski](https://piotrmirowski.com/)</li><details><summary>others</summary><li>[Dylan Banarse](https://www.2ne1.com/)</li> <li>[Simon Osindero](https://scholar.google.com/citations?user=Jq8ZS5kAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/arnheim?style=social)](https://github.com/deepmind/arnheim) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.00162), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.14843), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.07729), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1606.02580), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1609.09106)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/dall-e)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Compositional_pattern-producing_network)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=U7guaMdeF4g), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=zh0goLbS-l0), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=SYJGNt7yu6M), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=MxkYKa0x5AU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/arnheim/blob/master/arnheim_2.ipynb) | 11.11.2021 |
    • Mikael Christensen - badge.php?doi=10.1109/CVPR42600.2020.00813)](https://doi.org/10.1109/CVPR42600.2020.00813) [![](https://img.shields.io/github/stars/NVlabs/stylegan2?style=social)](https://github.com/NVlabs/stylegan2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](http://arxiv.org/abs/1912.04958)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/c-NJtV9Jvp0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1ShgW6wohEFQtqs_znMna3dzrcVoABKIH) | 05.11.2021 |
    • Ziyu Wan - zhang.me/)</li> <li>[Dongdong Chen](http://www.dongdongchen.bid/)</li> <li>[Pan Zhang](https://panzhang0212.github.io/)</li><details><summary>others</summary><li>[Dong Chen](http://www.dongchen.pro/)</li> <li>[Jing Liao](https://liaojing.github.io/html/)</li> <li>[Fang Wen](https://www.microsoft.com/en-us/research/people/fangwen/)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/Bringing-Old-Photos-Back-to-Life?style=social)](https://github.com/microsoft/Bringing-Old-Photos-Back-to-Life) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.09484)</li><li>[demo](https://replicate.com/microsoft/bringing-old-photos-back-to-life)</li><li>[project](http://raywzy.com/Old_Photo/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Q5bhszQq9eA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1NEm6AsybIiC5TwTU_4DqDkQO0nFRB-uA) | 13.07.2021 |
    • Suttisak Wizadwongsa - yenphraphai-990ba6175/)</li> <li>[Supasorn Suwajanakorn](https://www.supasorn.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00843)](https://doi.org/10.1109/CVPR46437.2021.00843) [![](https://img.shields.io/github/stars/nex-mpi/nex-code?style=social)](https://github.com/nex-mpi/nex-code) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.05606)</li><li>[data](https://vistec-my.sharepoint.com/personal/pakkapon_p_s19_vistec_ac_th/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fpakkapon%5Fp%5Fs19%5Fvistec%5Fac%5Fth%2FDocuments%2Fpublic%2FVLL%2FNeX%2Fshiny%5Fdatasets&originalPath=aHR0cHM6Ly92aXN0ZWMtbXkuc2hhcmVwb2ludC5jb20vOmY6L2cvcGVyc29uYWwvcGFra2Fwb25fcF9zMTlfdmlzdGVjX2FjX3RoL0VuSVVoc1JWSk9kTnNaXzRzbWRoeWUwQjh6MFZseHFPUjM1SVIzYnAwdUd1cFE%5FcnRpbWU9WXRVQTQtQTcyVWc), [data](https://vistec-my.sharepoint.com/personal/pakkapon_p_s19_vistec_ac_th/_layouts/15/onedrive.aspx?originalPath=aHR0cHM6Ly92aXN0ZWMtbXkuc2hhcmVwb2ludC5jb20vOmY6L2cvcGVyc29uYWwvcGFra2Fwb25fcF9zMTlfdmlzdGVjX2FjX3RoL0VyalBSUkw5Sm5GSXA4TU42ZDFqRXVvQjNYVm94SmtmZlBqZm9QeWhIa2owZGc%5FcnRpbWU9bC0yYWctRTcyVWc&id=%2Fpersonal%2Fpakkapon%5Fp%5Fs19%5Fvistec%5Fac%5Fth%2FDocuments%2Fpublic%2FVLL%2FNeX%2Fmodified%5Fdataset)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Fyusion/LLFF)</li><li>[project](https://nex-mpi.github.io/)</li><li>[vistec](https://vistec.ist/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=HyfkF7Z-ddA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1hXVvYdAwLA0EFg2zrafJUE0bFgB_F7PU) | 25.03.2021 |
    • Alexander Kolesnikov - badge.php?doi=10.1007/978-3-030-58558-7_29)](https://doi.org/10.1007/978-3-030-58558-7_29) [![](https://img.shields.io/github/stars/google-research/big_transfer?style=social)](https://github.com/google-research/big_transfer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.11370), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.05237)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/google/bit-50)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://sh-tsang.medium.com/review-big-transfer-bit-general-visual-representation-learning-cb4bf8ed9732)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/k1GOF2jmX7c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0iTgt5-SOsU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/X5Rhm__OxvA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/big_transfer/blob/master/colabs/big_transfer_tf2.ipynb) | 12.11.2020 |
    • Kaiming He - badge.php?doi=10.1109/CVPR42600.2020.00975)](https://doi.org/10.1109/CVPR42600.2020.00975) [![](https://img.shields.io/github/stars/facebookresearch/moco?style=social)](https://github.com/facebookresearch/moco) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.05722), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.04297), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.02677)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ppwwyyxx/moco.tensorflow)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LvHwBQF14zs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4VVGtYPM8JE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/o5Qh61dLDf0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/moco/blob/colab-notebook/colab/moco_cifar10_demo.ipynb) | 20.08.2020 |
    • Ryota Natsume - badge.php?doi=10.1109/ICCV.2019.00239)](https://doi.org/10.1109/ICCV.2019.00239) [![](https://img.shields.io/github/stars/shunsukesaito/PIFu?style=social)](https://github.com/shunsukesaito/PIFu) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1905.05172)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=S1FpjwKqtPs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1GFSsqP2BWz4gtq0e-nki00ZHSirXwFyY) | 08.10.2024 |
    • Tomasz Latkowski - gmcnn-keras?style=social)](https://github.com/tlatkowski/inpainting-gmcnn-keras) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1810.08771)</li><li>[data](http://places2.csail.mit.edu/download.html), [data](https://nv-adlr.github.io/publication/partialconv-inpainting)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/keras-team/keras-contrib/blob/master/examples/improved_wgan.py)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper_files/paper/2018/hash/6f3ef77ac0e3619e98159e9b6febf557-Abstract.html)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tlatkowski/inpainting-gmcnn-keras/blob/master/colab/Image_Inpainting_with_GMCNN_model.ipynb) | 09.08.2019 |
    • Andrew Brock - research?style=social)](https://github.com/deepmind/deepmind-research/tree/master/nfnets) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.06171), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2101.08692)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/jaxline)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rNkHjZtH0RQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/qyy2WhRRSI4?feature=share)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/deepmind-research/blob/master/nfnets/nfnet_demo_colab.ipynb) | 17.02.2021 |
    • Yu-Lun Liu - Sheng Lai](https://www.wslai.net/)</li> <li>[Ming-Hsuan Yang](https://faculty.ucmerced.edu/mhyang/)</li> <li>[Yung-Yu Chuang](https://www.csie.ntu.edu.tw/~cyy/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00230)](https://doi.org/10.1109/ICCV48922.2021.00230) [![](https://img.shields.io/github/stars/alex04072000/NeRViS?style=social)](https://github.com/alex04072000/NeRViS) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.06205)</li><li>[data](http://liushuaicheng.org/SIGGRAPH2013/database.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cxjyxxme/deep-online-video-stabilization), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/jinsc37/DIFRINT)</li><li>[project](https://alex04072000.github.io/NeRViS/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/KO3sULs4hso)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1l-fUzyM38KJMZyKMBWw_vu7ZUyDwgdYH) | 11.04.2021 |
    • John Jumper - freiburg.de/people/ronneber/)</li> <li>[Kathryn Tunyasuvunakool](https://scholar.google.com/citations?user=eEqNGagAAAAJ)</li> <li>[Russ Bates](https://scholar.google.com/citations?user=Koes5ewAAAAJ)</li> <li>[Augustin Žídek](https://augustin.zidek.eu/)</li> <li>[Anna Potapenko](http://apotapenko.com/)</li> <li>[Alex Bridgland](https://scholar.google.com/citations?user=VWmXKPMAAAAJ)</li> <li>[Clemens Meyer](https://scholar.google.com/citations?user=EWLZiM8AAAAJ)</li> <li>[Simon Kohl](https://www.simonkohl.com/)</li> <li>[Andrew Ballard](https://scholar.google.com/citations?user=syjQhAMAAAAJ)</li> <li>[Bernardino Romera-Paredes](https://sites.google.com/site/romeraparedes/)</li> <li>[Stanislav Nikolov](https://scholar.google.co.uk/citations?user=O-b7pBEAAAAJ)</li> <li>[Rishub Jain](http://rishub.me/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03819-2)](https://doi.org/10.1038/s41586-021-03819-2) [![](https://img.shields.io/github/stars/deepmind/alphafold?style=social)](https://github.com/deepmind/alphafold/) <ul><li>[blog post](https://deepmind.com/blog/article/alphafold-a-solution-to-a-50-year-old-grand-challenge-in-biology), [blog post](https://deepmind.com/blog/article/putting-the-power-of-alphafold-into-the-worlds-hands)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/tree), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/chex)</li><li>[paper](https://www.nature.com/articles/s41586-021-03828-1)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/alphafold)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/AlphaFold)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=gg7WjuFs8F4), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=B9PL__gVxLI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/alphafold/blob/master/notebooks/AlphaFold.ipynb) | 15.04.2024 |
    • Maxime Oquab - Nouby](https://aelnouby.github.io/)</li> <li>[Mahmoud Assran](http://www.midoassran.ca/)</li> <li>[Nicolas Ballas](https://scholar.google.com/citations?user=euUV4iUAAAAJ)</li> <li>[Wojciech Galuba](https://scholar.google.com/citations?user=jyaTX64AAAAJ)</li> <li>[Russell Howes](http://www.russellhowes.net/)</li> <li>[Po-Yao Huang](https://berniebear.github.io/)</li> <li>[Shang-Wen Li](https://swdanielli.github.io/)</li> <li>[Ishan Misra](http://imisra.github.io/)</li> <li>[Michael Rabbat](https://scholar.google.com/citations?user=cMPKe9UAAAAJ)</li> <li>[Vasu Sharma](https://vasusharma.github.io/)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Hu Xu](https://howardhsu.github.io/)</li> <li>[Hervé Jegou](https://github.com/jegou)</li> <li>[Julien Mairal](http://thoth.inrialpes.fr/people/mairal/)</li> <li>[Patrick Labatut](https://github.com/patricklabatut)</li> <li>[Armand Joulin](https://scholar.google.com/citations?user=kRJkDakAAAAJ)</li> <li>[Piotr Bojanowski](https://github.com/piotr-bojanowski)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/dinov2?style=social)](https://github.com/facebookresearch/dinov2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.07193)</li><li>[blog post](https://ai.facebook.com/blog/dino-v2-computer-vision-self-supervised-learning/)</li><li>[demo](https://dinov2.metademolab.com/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/model_doc/dinov2)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://purnasaigudikandula.medium.com/dinov2-image-classification-visualization-and-paper-review-745bee52c826), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/meta-ais-another-revolutionary-large-scale-model-dinov2-for-image-feature-extraction-1114b287eadd)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/csEgtSh7jV4), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/KSZiJ4k28b4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RZEkdOc3szU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/dinov2/blob/main/notebooks/semantic_segmentation.ipynb) | 31.08.2023 |
    • Tero Karras - Aittala)</li> <li>[Samuli Laine](https://research.nvidia.com/person/Samuli-Laine)</li> <li>[Erik Härkönen](https://github.com/harskish)</li><details><summary>others</summary><li>[Janne Hellsten](https://research.nvidia.com/person/Janne-Hellsten)</li> <li>[Jaakko Lehtinen](https://users.aalto.fi/~lehtinj7/)</li> <li>[Timo Aila](https://research.nvidia.com/person/timo-aila)</li></ul></details> | [![](https://img.shields.io/github/stars/NVlabs/stylegan3?style=social)](https://github.com/NVlabs/stylegan3) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.12423), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.08500), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.01401), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.06991), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1812.04948), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1606.03498)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan3-detector), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/metfaces-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2021/hash/076ccd93ad68be51f23707988e934906-Abstract.html)</li><li>[project](https://nvlabs.github.io/stylegan3)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1BXNHZBai-pXtP-ncliouXo_kUiG1Pq7M) | 13.08.2023 |
    • Alexandre Défossez - Gui), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/kuielab/mdx-net-submission), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/f90/Wave-U-Net)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dC9nVxk3V_VPjUADsnFu8EiT-xnU1tGH) | 21.11.2022 |
    • Yung-Sung Chuang - ms.mit.edu/rumen.html)</li> <li>[Hongyin Luo](https://luohongyin.github.io/)</li> <li>[Yang Zhang](https://mitibmwatsonailab.mit.edu/people/yang-zhang/)</li><details><summary>others</summary><li>[Shiyu Chang](https://code-terminator.github.io/)</li> <li>[Marin Soljačić](http://www.mit.edu/~soljacic/marin.html)</li> <li>[Shang-Wen Li](https://swdanielli.github.io/)</li> <li>[Scott Wen-tau Yih](https://scottyih.org/)</li> <li>[Yoon Kim](https://people.csail.mit.edu/yoonkim/)</li> <li>[James Glass](http://groups.csail.mit.edu/sls/people/glass.shtml)</li></ul></details> | [![](https://img.shields.io/github/stars/voidism/diffcse?style=social)](https://github.com/voidism/diffcse) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.10298), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.08821), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.00899)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/princeton-nlp/SimCSE)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/voidism)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/YungSungChuang/status/1517518077902000129)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/voidism/DiffCSE/blob/master/diffcse_evaluation.ipynb) | 24.04.2022 |
    • Arantxa Casanova - careil-901804155)</li> <li>[Jakob Verbeek](http://thoth.inrialpes.fr/~verbeek/)</li> <li>[Michał Drożdżal](https://scholar.google.com/citations?user=XK_ktwQAAAAJ)</li> <li>[Adriana Romero-Soriano](https://sites.google.com/site/adriromsor)</li></ul> | [![](https://img.shields.io/github/stars/facebookresearch/ic_gan?style=social)](https://github.com/facebookresearch/ic_gan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.05070)</li><li>[blog post](https://ai.facebook.com/blog/instance-conditioned-gans/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/faiss), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ajbrock/BigGAN-PyTorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2-ada-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/bioinf-jku/TTUR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mit-han-lab/data-efficient-gans)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://proceedings.neurips.cc/paper/2021/hash/e7ac288b0f2d41445904d071ba37aaff-Abstract.html)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/ic_gan/blob/master/inference/icgan_colab.ipynb) | 01.10.2021 |
    • Dmytro Kotovenko - wright.github.io/)</li> <li>[Arthur Heimbrecht](https://github.com/arwehei)</li> <li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01202)](https://doi.org/10.1109/CVPR46437.2021.01202) [![](https://img.shields.io/github/stars/CompVis/brushstroke-parameterized-style-transfer?style=social)](https://github.com/CompVis/brushstroke-parameterized-style-transfer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.17185)</li><li>[project](https://compvis.github.io/brushstroke-parameterized-style-transfer/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/brushstroke-parameterized-style-transfer/blob/tensorflow_v2/notebooks/BrushstrokeStyleTransfer_TF2.ipynb) | 02.06.2021 |
    • Ming Zhong - us/research/people/shuowa/)</li> <li>[Yadong Lu](https://adamlu123.github.io/)</li><details><summary>others</summary><li>[Yizhu Jiao](https://yzjiao.github.io/)</li> <li>[Siru Ouyang](https://ozyyshr.github.io/)</li> <li>[Donghan Yu](https://plusross.github.io/)</li> <li>[Jiawei Han](https://hanj.cs.illinois.edu/)</li> <li>[Weizhu Chen](https://www.microsoft.com/en-us/research/people/wzchen/)</li></ul></details> | [![](https://img.shields.io/github/stars/maszhongming/Multi-LoRA-Composition?style=social)](https://github.com/maszhongming/Multi-LoRA-Composition) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2402.16843)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@letscodeai/multi-lora-composition-for-image-generation-f2706528c590)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/ninjasaid13/comments/1b13q8s/multilora_composition_for_image_generation/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://x.com/MingZhong_/status/1762347881812443575?s=20)</li><li>[website](https://maszhongming.github.io/Multi-LoRA-Composition/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1eSTj6qGOtSY5NaazwwN3meXOzEZxgaZq) | 03.03.2024 |
    • Rémi Lam - Gonzalez](https://github.com/alvarosg)</li> <li>[Matthew Willson](https://github.com/mjwillson)</li> <li>[Peter Wirnsberger](https://pewi.org/)</li><details><summary>others</summary><li>[Meire Fortunato](https://scholar.google.com/citations?user=_fMHSIUAAAAJ)</li> <li>[Ferran Alet](https://scholar.google.com/citations?user=1lmBq3QAAAAJ)</li> <li>[Suman Ravuri](https://www.linkedin.com/in/suman-ravuri-81928082)</li> <li>[Timo Ewalds](https://github.com/tewalds)</li> <li>[Zach Eaton-Rosen](https://scholar.google.com/citations?user=mQ3zD_wAAAAJ)</li> <li>[Weihua Hu](https://weihua916.github.io/)</li> <li>[Alexander Merose](https://alex.merose.com/)</li> <li>[Stephan Hoyer](https://stephanhoyer.com/)</li> <li>[George Holland](https://www.linkedin.com/in/g-aracil-holland)</li> <li>[Oriol Vinyals](https://research.google/people/oriol-vinyals/)</li> <li>[Jacklynn Stott](https://linkedin.com/in/jacklynnstott)</li> <li>[Alexander Pritzel](https://github.com/a-pritzel)</li> <li>[Shakir Mohamed](https://www.shakirm.com/)</li> <li>[Peter Battaglia](https://scholar.google.com/citations?user=nQ7Ij30AAAAJ)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1126/science.adi2336)](https://doi.org/10.1126/science.adi2336) [![](https://img.shields.io/github/stars/google-deepmind/graphcast?style=social)](https://github.com/google-deepmind/graphcast) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.12794)</li><li>[data](https://www.ecmwf.int/en/forecasts/datasets/reanalysis-datasets/era5)</li><li>[<img src="images/deepmind.svg" alt="deepmind" height=20/>](https://deepmind.google/discover/blog/graphcast-ai-model-for-faster-and-more-accurate-global-weather-forecasting/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-deepmind/chex), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/dask/dask), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-deepmind/jaxline), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-deepmind/tree), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mikedh/trimesh)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/graphcast-how-to-get-things-done-f2fd5630c5fb)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BufUW7h9TB8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/PD1v5PCJs_o), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Eul-JN9Nwb0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BTyhgp9Hugc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/aJ_H4exg0xU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/graphcast/blob/master/graphcast_demo.ipynb) | 04.01.2024 |
    • Junsong Chen - alpha/PixArt-sigma?style=social)](https://github.com/PixArt-alpha/PixArt-sigma) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2403.04692), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.00426), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2401.05252)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/rde6eaE5Ta)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/PixArt-alpha/PixArt-alpha), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/PixArt-alpha/PixArt-LCM)</li><li>[project](https://pixart-alpha.github.io/PixArt-sigma-project/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/PixArtSigma/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1jZ5UZXk7tcpTfVwnX33dDuefNMcnW9ME) | 07.11.2023 |
    • Hao-Shu Fang - hy)</li> <li>[Chao Xu](https://www.isdas.cn/)</li><details><summary>others</summary><li>[Haoyi Zhu](https://www.haoyizhu.site/)</li> <li>[Yuliang Xiu](https://xiuyuliang.cn/)</li> <li>[Yong-Lu Li](https://dirtyharrylyl.github.io/)</li> <li>[Cewu Lu](https://scholar.google.com/citations?user=QZVQEWAAAAAJ)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2022.3222784)](https://doi.org/10.1109/TPAMI.2022.3222784) [![](https://img.shields.io/github/stars/MVIG-SJTU/AlphaPose?style=social)](https://github.com/MVIG-SJTU/AlphaPose) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.03375)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tycoer/AlphaPose_jittor), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Fang-Haoshu/Halpe-FullBody)</li><li>[project](https://www.mvig.org/research/alphapose.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uze6chg-YeU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Z2WPd59pRi8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qW4lb9tnA3I), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_qtNzylm1XI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1_3Wxi4H3QGVC28snL3rHIoeMAwI2otMR) | 07.01.2023 |
    • Muhammed Kocabas - nik)</li> <li>[Michael Black](https://ps.is.mpg.de/person/black)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00530)](https://doi.org/10.1109/CVPR42600.2020.00530) [![](https://img.shields.io/github/stars/mkocabas/VIBE?style=social)](https://github.com/mkocabas/VIBE) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.05656)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/carlosedubarreto/vibe_win_install), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vchoutas/smplx), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/akanazawa/human_dynamics), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/MandyMo/pytorch_HMR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/soulslicer/STAF/tree/staf)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/sota/3d-human-pose-estimation-on-3dpw?p=vibe-video-inference-for-human-body-pose-and)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3qhs5IRJ1LI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w1biKeiQThY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rIr-nX63dUA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fW0sIZfQcIs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8Qt0wA16kTo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xyo5gl5GLEI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/XNzgUhxKC38), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hErK0MamTY4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Gfmm8uMfMq0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dFfwxZ52MN86FA6uFNypMEdFShd2euQA) | 23.12.2020 |
    • Nikita Karaev - graham/)</li> <li>[Natalia Neverova](https://nneverova.github.io/)</li><details><summary>others</summary><li>[Andrea Vedaldi](https://www.robots.ox.ac.uk/~vedaldi/)</li> <li>[Christian Rupprecht](https://chrirupp.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/co-tracker?style=social)](https://github.com/facebookresearch/co-tracker) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2307.07635), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.11898)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/benjiebob/BADJA)</li><li>[project](https://co-tracker.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w5QVc7BVGPA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/co-tracker/blob/main/notebooks/demo.ipynb) | 16.10.2024 |
    • Zongsheng Yue - ntu.com/person/ccloy/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2024.3432651)](https://doi.org/10.1109/TPAMI.2024.3432651) [![](https://img.shields.io/github/stars/zsyOAOA/DifFace?style=social)](https://github.com/zsyOAOA/DifFace) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.06512)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/ffhq-dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/improved-diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepcam-cn/yolov5-face), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/OAOA/DifFace)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1BNtoPPRuJwNDvqfwDOOmD9XJyF05Zh4m) | 05.10.2024 |
    • Nikhila Ravi - Ting Hu](https://scholar.google.com/citations?user=E8DVVYQAAAAJ)</li> <li>[Ronghang Hu](https://ronghanghu.com/)</li><details><summary>others</summary><li>[Chaitanya Ryali](https://scholar.google.com/citations?user=4LWx24UAAAAJ)</li> <li>[Tengyu Ma](https://scholar.google.com/citations?user=VeTSl0wAAAAJ)</li> <li>[Haitham Khedr](https://hkhedr.com/)</li> <li>[Roman Rädle](https://scholar.google.de/citations?user=Tpt57v0AAAAJ)</li> <li>[Chloe Rolland](https://scholar.google.com/citations?hl=fr&user=n-SnMhoAAAAJ)</li> <li>[Laura Gustafson](https://scholar.google.com/citations?user=c8IpF9gAAAAJ)</li> <li>[Eric Mintun](https://ericmintun.github.io/)</li> <li>[Junting Pan](https://junting.github.io/)</li> <li>[Kalyan Vasudev Alwala](https://scholar.google.co.in/citations?user=m34oaWEAAAAJ)</li> <li>[Nicolas Carion](https://www.nicolascarion.com/)</li> <li>[Chao-Yuan Wu](https://chaoyuan.org/)</li> <li>[Ross Girshick](https://www.rossgirshick.info/)</li> <li>[Piotr Dollár](https://pdollar.github.io/)</li> <li>[Christoph Feichtenhofer](https://feichtenhofer.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/segment-anything-2?style=social)](https://github.com/facebookresearch/segment-anything-2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2408.00714)</li><li>[demo](https://sam2.metademolab.com/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/zsef123/Connected_components_PyTorch)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/models?search=facebook/sam2)</li><li>[<img src="images/meta.svg" alt="meta" height=20/>](https://ai.meta.com/research/publications/sam-2-segment-anything-in-images-and-videos/), [<img src="images/meta.svg" alt="meta" height=20/>](https://ai.meta.com/datasets/segment-anything-video), [<img src="images/meta.svg" alt="meta" height=20/>](https://ai.meta.com/blog/segment-anything-2)</li><li>[project](https://ai.meta.com/sam2/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://x.com/AIatMeta/status/1818055906179105010)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=w-cmMcMZoZ4&t=2325s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/O8QdvZbRDp4), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/Dv003fTyO-Y), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/IW7jFq3vQbw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/segment-anything-2/blob/main/notebooks/image_predictor_example.ipynb) | 01.10.2024 |
    • Manuel Romero - badge.php?doi=10.1109/ICCV.2019.00880)](https://doi.org/10.1109/ICCV.2019.00880) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1903.04411)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/reinforcementlearning/comments/b5lpfl/learning_to_paint_with_modelbased_deep/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=YmOgKZ5oipk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mrm8488/shared_colab_notebooks/blob/master/custom_learningtopaint.ipynb) | 01.02.2023 |
    • Adam Roberts - vae)</li><li>[project](https://magenta.tensorflow.org/music-vae)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLBUMAYA6kvGU8Cgqh709o5SUvo-zHGTxr)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/magenta/magenta-demos/blob/master/colab-notebooks/MusicVAE.ipynb) | 01.02.2023 |
    • Yujun Shen - badge.php?doi=10.1109/CVPR42600.2020.00926)](https://doi.org/10.1109/CVPR42600.2020.00926) [![](https://img.shields.io/github/stars/genforce/interfacegan?style=social)](https://github.com/genforce/interfacegan) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.10786), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.09635), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.10196)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tkarras/progressive_growing_of_gans), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan)</li><li>[project](https://genforce.github.io/interfacegan/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=uoftpl3Bj6w)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/genforce/interfacegan/blob/master/docs/InterFaceGAN.ipynb) | 13.10.2020 |
    • Yuliang Xiu - 42.github.io/homepage/)</li> <li>[Dimitrios Tzionas](https://ps.is.mpg.de/~dtzionas)</li> <li>[Michael Black](https://ps.is.mpg.de/~black)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00057)](https://doi.org/10.1109/CVPR52729.2023.00057) [![](https://img.shields.io/github/stars/YuliangXiu/ECON?style=social)](https://github.com/YuliangXiu/ECON) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.07422)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/Vqa7KBGRyk)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://github.com/YuliangXiu/ECON/blob/master/docs/installation-docker.md)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/kwan3854/CEB_ECON), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xucao-42/bilateral_normal_integration), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Project-Splinter/MonoPortDataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/huangyangyi/TeCH), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vchoutas/smplx), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/yfeng95/PIXIE)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1451sjr/econ_explicit_clothed_humans_optimized_via_normal/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/yuliangxiu)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/sbWZbTf6ZYk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SDVfCeaI4AY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5PEd_p90kS0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MDFvV7y5Qgk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1YRgwoRCZIrSB2e7auEWFyG10Xzjbrbno) | 31.05.2023 |
    • Jason Antic - robinson)</li> <li>[María Benavente](https://github.com/mariabg)</li></ul> | [![](https://img.shields.io/github/stars/jantic/DeOldify?style=social)](https://github.com/jantic/DeOldify) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1805.08318), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.08500)</li><li>[model](https://data.deepai.org/deoldify/ColorizeArtistic_gen.pth)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/TheWayWeWere/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/DeOldify)</li><li>[website](https://deoldify.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jantic/DeOldify/blob/master/ImageColorizerColab.ipynb) | 19.09.2022 |
    • Zhen Li - Ze Lu](https://github.com/LGYoung)</li> <li>[Jianhua Qin](https://scholar.google.com/citations?&user=TAr7TU4AAAAJ)</li> <li>[Chun-Le Guo](https://scholar.google.com/citations?user=RZLYwR0AAAAJ)</li> <li>[Ming-Ming Cheng](https://mmcheng.net/)</li></ul> | [![](https://img.shields.io/github/stars/MCG-NKU/E2FGVI?style=social)](https://github.com/MCG-NKU/E2FGVI) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.02663)</li><li>[data](https://competitions.codalab.org/competitions/19544#participate-get-data), [data](https://data.vision.ee.ethz.ch/csergi/share/davis/DAVIS-2017-trainval-480p.zip)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/researchmm/STTN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/Focal-Transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ruiliu-ai/FuseFormer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/phoenix104104/fast_blind_video_consistency#evaluation)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/mlearning-ai/end-to-end-framework-for-flow-guided-video-inpainting-c5e2d8b61d20)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/N--qC3T2wc4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3eH3Fm6gOFk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12rwY2gtG8jVWlNx9pjmmM8uGmh5ue18G) | 06.04.2022 |
    • Shanchuan Lin - saleemi)</li> <li>[Soumyadip Sengupta](https://github.com/senguptaumd)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/WACV51458.2022.00319)](https://doi.org/10.1109/WACV51458.2022.00319) [![](https://img.shields.io/github/stars/PeterL1n/RobustVideoMatting?style=social)](https://github.com/PeterL1n/RobustVideoMatting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.11515)</li><li>[project](https://peterl1n.github.io/RobustVideoMatting)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/pdbpmg/r_robust_highresolution_video_matting_with/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Jvzltozpbpk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ay-mGCEYEzM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VL-0K6HjhvQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Jhuf6M_VrBI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_oN9yyRi3HY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/10z-pNKRnVNsp0Lq9tH1J_XPZ7CBC_uHm) | 24.11.2021 |
    • Jiaxiang Tang - dreamfusion?style=social)](https://github.com/ashawkey/stable-dreamfusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.14988)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ashawkey/torch-ngp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hoffstadt/DearPyGui)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/runwayml/stable-diffusion-v1-5)</li><li>[project](https://dreamfusion3d.github.io/)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/docs/stable/cpp_extension.html#torch.utils.cpp_extension.load)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uM5NPodZZ1U?t=219), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zWD5ZR5GtJM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L3G0dx1Q0R8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dIgDbBTztUM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1MXT3yfOFvO0ooKEfiUUvTKwUkrrlCHpF) | 04.04.2023 |
    • Jingyun Liang - wuerzburg.de/computervision/home/)</li> <li>[Luc Van Gool](https://scholar.google.com/citations?user=TwMib_QAAAAJ)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2024.3372454)](https://doi.org/10.1109/TIP.2024.3372454) [![](https://img.shields.io/github/stars/JingyunLiang/VRT?style=social)](https://github.com/JingyunLiang/VRT) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.12288)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cszn/KAIR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/SwinTransformer/Video-Swin-Transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmediting)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/gist/JingyunLiang/deb335792768ad9eb73854a8efca4fe0/vrt-demo-on-video-restoration.ipynb) | 15.06.2022 |
    • Shuai Yang - jiang.com/)</li> <li>[Ziwei Liu](https://liuziwei7.github.io/)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00754)](https://doi.org/10.1109/CVPR52688.2022.00754) [![](https://img.shields.io/github/stars/williamyang1991/DualStyleGAN?style=social)](https://github.com/williamyang1991/DualStyleGAN) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13248)</li><li>[data](https://cs.nju.edu.cn/rl/WebCaricature.htm), [data](https://www.gwern.net/Crops#danbooru2019-portraits)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/lowfuel/progrock-stable), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TreB1eN/InsightFace_Pytorch)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/Gradio-Blocks/DualStyleGAN), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/hysts/DualStyleGAN)</li><li>[project](https://www.mmlab-ntu.com/project/dualstylegan/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/scZTu77jixI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/williamyang1991/DualStyleGAN/blob/master/notebooks/inference_playground.ipynb) | 24.03.2022 |
    • Roy Or-El - shechtman/)</li> <li>[Ira Kemelmacher-Shlizerman](https://www.irakemelmacher.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58539-6_44)](https://doi.org/10.1007/978-3-030-58539-6_44) [![](https://img.shields.io/github/stars/royorel/Lifespan_Age_Transformation_Synthesis?style=social)](https://github.com/royorel/Lifespan_Age_Transformation_Synthesis) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.09764)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/royorel/FFHQ-Aging-Dataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/pix2pixHD), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/style-based-gan-pytorch)</li><li>[project](https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_jTFcjN2hBk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9fulnt2_q_Y)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/royorel/Lifespan_Age_Transformation_Synthesis/blob/master/LATS_demo.ipynb) | 31.10.2020 |
    • Yuval Alaluf - Or](https://danielcohenor.com/)</li></ul> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450626.3459805)](https://doi.org/10.1145/3450626.3459805) [![](https://img.shields.io/github/stars/yuval-alaluf/SAM?style=social)](https://github.com/yuval-alaluf/SAM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.02754)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/eladrich/pixel2style2pixel), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch)</li><li>[project](https://yuval-alaluf.github.io/SAM/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/X_pYC_LtBFw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/yuval-alaluf/SAM/blob/master/notebooks/animation_inference_playground.ipynb) | 26.04.2021 |
    • Ian Simon - rnn)</li><li>[data](http://www.piano-e-competition.com/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/notebooks/magenta/performance_rnn/performance_rnn.ipynb) | 11.07.2017 |
  • Tutorials

    • Jonathan Godwin - bapst-73430a89)</li><details><summary>others</summary><li>[Thomas Kipf](https://tkipf.github.io/)</li> <li>[Yujia Li](https://yujiali.github.io/)</li> <li>[Kimberly Stachenfeld](https://neurokim.com/)</li> <li>[Petar Veličković](https://petar-v.com/)</li> <li>[Alvaro Sanchez-Gonzalez](https://github.com/alvarosg)</li></ul></details> | [![](https://img.shields.io/github/stars/google-deepmind/jraph?style=social)](https://github.com/google-deepmind/jraph) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1806.01261)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://jraph.readthedocs.io/en/latest/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/S3sRy4oqvCM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-deepmind/educational/blob/master/colabs/summer_schools/intro_to_graph_nets_tutorial_with_jraph.ipynb) | 14.04.2022 |
    • Dennis Ulmer - badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266) [![](https://img.shields.io/github/stars/Kaleidophon/deep-significance?style=social)](https://github.com/Kaleidophon/deep-significance) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2204.06815)</li><li>[blog post](https://machinelearningmastery.com/statistical-hypothesis-tests/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://deep-significance.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rtmdrr/replicability-analysis-NLP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rtmdrr/testSignificanceNLP), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rtmdrr/DeepComparison)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Multiple_comparisons_problem)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Kaleidophon/deep-significance/blob/main/paper/deep-significance%20demo.ipynb) | 12.04.2022 |
    • Silero team - models?style=social)](https://github.com/snakers4/silero-models) <ul><li>[STT](https://thegradient.pub/towards-an-imagenet-moment-for-speech-to-text/), [STT](https://thegradient.pub/a-speech-to-text-practitioners-criticisms-of-industry-and-academia/), [STT](https://habr.com/ru/post/519562/)</li><li>[TTS](https://habr.com/ru/post/660571/), [TTS](https://habr.com/ru/post/549482/)</li><li>[Text Enhancement](https://habr.com/ru/post/581960/)</li><li>[VAD](https://thegradient.pub/one-voice-detector-to-rule-them-all/), [VAD](https://habr.com/ru/post/537276/)</li><li>[website](https://www.silero.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/snakers4/silero-models/blob/master/examples.ipynb) | 27.02.2022 |
    • Alexander Spirin
    • Bowen Shi - Ning Hsu](http://people.csail.mit.edu/wnhsu/)</li> <li>[Kushal Lakhotia](https://about.me/hikushalhere)</li> <li>[Abdelrahman Mohamed](http://www.cs.toronto.edu/~asamir/)</li></ul> | [![](https://img.shields.io/github/stars/facebookresearch/av_hubert?style=social)](https://github.com/facebookresearch/av_hubert) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.02184), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.01763), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1810.04805), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.04890)</li><li>[blog post](https://ai.facebook.com/blog/ai-that-understands-speech-by-looking-as-well-as-hearing/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1bNXkfpHiVHzXQH8WjGhzQ-fsDxolpUjD) | 12.02.2022 |
    • Jonathan Shen
    • Cameron Smith - style-tf?style=social)](https://github.com/cysmith/neural-style-tf) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1604.08610), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1606.05897), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1508.06576)</li><li>[cvpr](https://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/Gatys_Image_Style_Transfer_CVPR_2016_paper.pdf)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Pastiche), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/The_Starry_Night), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/YUV), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Lab_color_space), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/YCbCr), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/CIELUV), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Pareidolia)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/14aJ7HQPbcP0sNRIY-FRO4u6lxtlyyxI_) | 01.10.2021 |
    • Chris Cummins - eth.github.io/)</li> <li>[Brandon Cui](https://www.linkedin.com/in/bcui19/)</li><details><summary>others</summary><li>[Jason Ansel](https://jasonansel.com/)</li> <li>[Sahir Gomez](https://github.com/sahirgomez1)</li> <li>[Olivier Teytaud](https://github.com/teytaud)</li> <li>[Benoit Steiner](http://bsteiner.info/)</li> <li>[Yuandong Tian](http://yuandong-tian.com/)</li> <li>[Hugh Leather](https://github.com/hughleat)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/CompilerGym?style=social)](https://github.com/facebookresearch/CompilerGym) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2109.08267)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://facebookresearch.github.io/CompilerGym/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/CompilerGym/blob/development/examples/getting-started.ipynb) | 16.11.2021 |
    • Ben Wang - me/)</li> <li>[Janko Prester](https://www.jankoprester.com/)</li></ul> | [![](https://img.shields.io/github/stars/kingoflolz/mesh-transformer-jax?style=social)](https://github.com/kingoflolz/mesh-transformer-jax) <ul><li>[The Pile](https://pile.eleuther.ai/)</li><li>[blog post](https://arankomatsuzaki.wordpress.com/2021/06/04/gpt-j/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/EleutherAI/gpt-neox), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeed)</li><li>[web demo](https://6b.eleuther.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/kingoflolz/mesh-transformer-jax/blob/master/colab_demo.ipynb) | 15.09.2021 |
    • Anurag Pratik - Qian)</li> <li>[Yuxuan Sun](https://github.com/snyxan)</li> <li>[Ryan Drew](https://rdrew.dev/)</li> <li>[Sara Elkafrawy](https://github.com/saraEbrahim)</li> <li>[Anoushka Tiwari](https://www.linkedin.com/in/anoushka-tiwari)</li> <li>[Tucker Hart](https://www.linkedin.com/in/tucker-hart-05a638133)</li> <li>[Mary Williamson](https://scholar.google.com/citations?user=Ys4xB-QAAAAJ)</li> <li>[Abhinav Gupta](http://www.cs.cmu.edu/~abhinavg/)</li> <li>[Arthur Szlam](https://scholar.google.com/citations?user=u3-FxUgAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/droidlet?style=social)](https://github.com/facebookresearch/droidlet) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2101.10384), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.08584)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://facebookresearch.github.io/droidlet/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/droidlet/blob/master/examples_and_tutorials/tutorials/droidlet_for_physical_robots.ipynb) | 15.09.2021 |
    • Ali Jahanian - design/gan_steerability?style=social)](https://github.com/ali-design/gan_steerability) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.07171), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1809.11096)</li><li>[project](https://ali-design.github.io/gan_steerability/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nS0V64sF7Cw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1kn6yG8PqD1U2bUcy32V1iAVjzlcQWcG3) | 04.03.2021 |
    • Rama Kumar
    • Chase Roberts
    • Romain Hennequin - spleeter-deezer-r-d-source-separation-engine-2b88985e797e)</li><li>[data](https://sigsep.github.io/datasets/musdb.html)</li><li>[project](https://research.deezer.com/projects/spleeter.html)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deezer/spleeter/blob/master/spleeter.ipynb) | 10.01.2021 |
    • Javier Gamazo - remover-partial-convolutions), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/zzh8829/yolov3-tf2)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=_dRjY9gMcxE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JDpH8MAjaKoekQ_H9ZaxYJ9_axiDtDGm) | 22.08.2020 |
    • pkulzc - serengeti)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/models/blob/master/research/object_detection/colab_tutorials/context_rcnn_tutorial.ipynb) | 17.06.2020 |
    • Bolei Zhou - segmentation-pytorch?style=social)](https://github.com/CSAILVision/semantic-segmentation-pytorch) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1608.05442), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1612.01105), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.10221), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.04514)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/CSAILVision/sceneparsing), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vacancy/Synchronized-BatchNorm-PyTorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hszhao/semseg)</li><li>[project](http://sceneparsing.csail.mit.edu/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CSAILVision/semantic-segmentation-pytorch/blob/master/notebooks/DemoSegmenter.ipynb) | 21.08.2020 |
    • Dan Holtmann-Rice - config?style=social)](https://github.com/google/gin-config) <ul><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/stop-worrying-about-configs-with-gin-218562dd5c91)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/gin-config/blob/master/gin/gin_intro.ipynb) | 13.08.2020 |
    • Changhan Wang - 4IAAAAJ)</li> <li>[Jiatao Gu](http://jiataogu.me/)</li></ul> | [![](https://img.shields.io/github/stars/facebookresearch/covost?style=social)](https://github.com/facebookresearch/covost) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2002.01320), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/pdf/2007.10310), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.06670)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://www.changhan.me//SpeechTransProgress), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairseq/tree/main/examples/speech_to_text), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/vizseq)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/11GK7k7G1CG1qHbdA9Pz1RtQ3vlCkuohV) | 07.08.2020 |
    • kmindspark - shot-object-detection)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/models/blob/master/research/object_detection/colab_tutorials/eager_few_shot_od_training_tf2_colab.ipynb) | 11.07.2020 |
    • Andrey Ryabtsev - Matting?style=social)](https://github.com/senguptaumd/Background-Matting) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.00626)</li><li>[blog post](https://towardsdatascience.com/background-matting-the-world-is-your-green-screen-83a3c4f0f635)</li><li>[data](https://drive.google.com/open?id=1j3BMrRFhFpfzJAe6P2WDtfanoeSCLPiq)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/gist/andreyryabtsev/243aa3eefa6e06891dda7b1583d1d08f/backmatting.ipynb) | 18.05.2020 |
    • Qiusheng Wu - visualization/folium?style=social)](https://github.com/python-visualization/folium) <ul><li>[api](https://developers.google.com/earth-engine/python_install)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/giswqs/qgis-earthengine-examples/blob/master/Folium/ee-api-folium-setup.ipynb) | 20.01.2020 |
    • Tanuj Jain - tran.com/)</li></ul> | [![](https://img.shields.io/github/stars/idealo/imagededup?style=social)](https://github.com/idealo/imagededup) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1704.04861)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://fullstackml.com/wavelet-image-hash-in-python-3504fdd282b5)</li><li>[project](https://idealo.github.io/imagededup/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/idealo/imagededup/blob/master/examples/CIFAR10_duplicates.ipynb) | 03.10.2019 |
    • Lucas Persona - house-of-black-and-white/hall-of-faces?style=social)](https://github.com/the-house-of-black-and-white/hall-of-faces) <ul><li>[data](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/)</li><li>[yolo](https://pjreddie.com/darknet/yolo/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/the-house-of-black-and-white/hall-of-faces/blob/master/notebooks/Hall_of_Faces.ipynb) | 15.03.2018 |
    • ![Stargazers over time - colab-notebooks)
    • Shengyi Huang - ai/CORL), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Farama-Foundation/Gymnasium), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/baselines), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ikostrikov/jaxrl)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/cleanrl)</li><li>[paper](https://www.jmlr.org/papers/v23/21-1342.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCDdC6BIFRI0jvcwuhi3aI6w), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dm4HdGujpPs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/vwxyzjn/cleanrl/blob/master/docs/get-started/CleanRL_Huggingface_Integration_Demo.ipynb) | 28.11.2023 |
    • Yuwei Guo - revolution/sd-webui-animatediff), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/talesofai/AnimateDiff), [<img src="images/git.svg" alt="git" height=20/>](https://youtu.be/-wki7IrQ_sU)</li><li>[project](https://animatediff.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rdnOhM8L8nE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LcHAZaJjA5k), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/66JgpI3a650?feature=share)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/AnimateDiff-colab/blob/main/AnimateDiff_colab.ipynb) | 30.10.2023 |
    • comfyanonymous
    • Lvmin Zhang - Q), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TJkrzuPdmvE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NfNwmKM3sxc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/lllyasviel/Fooocus/blob/main/colab.ipynb) | 03.10.2023 |
    • Mark Daoust - actor-critic-algorithms.pdf), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/1713-policy-gradient-methods-for-reinforcement-learning-with-function-approximation.pdf)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Temporal_difference_learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/reinforcement_learning/actor_critic.ipynb) | 27.09.2023 |
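The actor-critic tutorial above rests on the temporal-difference update linked from the entry. A minimal tabular TD(0) sketch on a hypothetical two-state chain (illustrative only, unrelated to the tutorial's TensorFlow code):

```python
# Tabular TD(0) value estimation on a toy chain:
# s0 -> s1 -> terminal, with reward +1 on the final step.

gamma = 1.0      # discount factor
alpha = 0.1      # learning rate
V = {0: 0.0, 1: 0.0, "T": 0.0}   # state-value estimates, terminal fixed at 0

for _ in range(200):
    # one episode: s0 (r=0) -> s1 (r=1) -> terminal
    for s, s_next, r in [(0, 1, 0.0), (1, "T", 1.0)]:
        # TD(0) update: V(s) += alpha * (r + gamma * V(s') - V(s))
        V[s] += alpha * (r + gamma * V[s_next] - V[s])

print(round(V[0], 2), round(V[1], 2))  # both approach the true return 1.0
```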
    • MMAction2 Contributors - mmlab/mmaction2?style=social)](https://github.com/open-mmlab/mmaction2) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.13230), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2107.10161), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.17263), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.13586), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.05095), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.13042)</li><li>[data](https://sdolivia.github.io/FineGym/), [data](http://www.svcl.ucsd.edu/projects/resound/dataset.html), [data](https://research.google.com/ava/index.html), [data](https://www.deepmind.com/open-source/kinetics)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mmaction2.readthedocs.io/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmcv), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/SwinTransformer/Video-Swin-Transformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Cogito2012/DEAR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xvjiarui/VFS), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/holistic-video-understanding/HVU-Dataset)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/open-mmlab/mmaction2/blob/master/demo/mmaction2_tutorial.ipynb) | 06.09.2023 |
    • Chris Paxton - robot?style=social)](https://github.com/facebookresearch/home-robot) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/cpaxton/contact_graspnet/tree/cpaxton/devel), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_body), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_firmware), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_ros), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_ros2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hello-robot/stretch_web_interface), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RoboStack/ros-noetic), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/codekansas/stretch-robot)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/home-robot/blob/master/src/home_robot_sim/notebooks/velocity_control_sim.ipynb) | 30.08.2023 |
    • Daniel Freeman
    • Samet Akcay - intel)</li> <li>[Utku Genc](https://github.com/ugenc-intel)</li></ul></details> | [![](https://img.shields.io/github/stars/openvinotoolkit/anomalib?style=social)](https://github.com/openvinotoolkit/anomalib) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2011.08785)</li><li>[data](https://www.mvtec.com/company/research/datasets/mvtec-ad)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://openvinotoolkit.github.io/anomalib/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/vnk8071/anomaly-detection-in-industry-manufacturing/tree/master/anomalib_contribute)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/getting-started-with-pytorch-image-models-timm-a-practitioners-guide-4e77b4bf9055)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/lib/timm)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/openvinotoolkit/anomalib/blob/main/notebooks/000_getting_started/001_getting_started.ipynb) | 15.05.2024 |
    • Lvmin Zhang - Light?style=social)](https://github.com/lllyasviel/IC-Light) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2312.06886), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2402.18848)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U_ZIkFb9P8w), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3EsJrdXGnpo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BuSsw8Nv1N4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/IC-Light-jupyter/blob/main/IC_Light_jupyter.ipynb) | 09.05.2024 |
    • Adam Stewart - us/research/people/jlavista/)</li> <li>[Arindam Banerjee](https://arindam.cs.illinois.edu/)</li></ul></details> | [![](https://img.shields.io/github/stars/microsoft/torchgeo?style=social)](https://github.com/microsoft/torchgeo) <ul><li>[NDBI](https://www.linkedin.com/pulse/ndvi-ndbi-ndwi-calculation-using-landsat-7-8-tek-bahadur-kshetri/)</li><li>[NDVI](https://gisgeography.com/ndvi-normalized-difference-vegetation-index/)</li><li>[NDWI](https://custom-scripts.sentinel-hub.com/custom-scripts/sentinel-2/ndwi/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.08872)</li><li>[data](https://docs.sentinel-hub.com/api/latest/data/sentinel-2-l2a/), [data](https://www.cogeo.org/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/davemlz/awesome-spectral-indices)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/torchgeo/blob/main/docs/tutorials/indices.ipynb) | 03.05.2024 |
    • Omry Yadan - omegaconf-a33be1b748ab)</li><li>[slides](https://docs.google.com/presentation/d/e/2PACX-1vT_UIV7hCnquIbLUm4NnkUpXvPEh33IKiUEvPRF850WKA8opOlZOszjKdZ3tPmf8u7hGNP6HpqS-NT5/pub)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/omry/omegaconf/blob/master/docs/notebook/Tutorial.ipynb) | 15.02.2024 |
    • microsoft - us/research/blog/autogen-enabling-next-generation-large-language-model-applications/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/pAbnFJrkgZ)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@multiplatform.ai/microsoft-autogen-transforming-ai-frameworks-for-enhanced-problem-solving-video-ac2655e7cdf)</li><li>[project](https://microsoft.github.io/autogen/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zdcCD--IieY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dCCr52uT0W8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/JMpgsx74XDI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/agentchat_RetrieveChat.ipynb) | 28.04.2024 |
    • Vittorio Caggiano
    • Jade Copet - kant-339a3b1b7)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li> <li>[Alexandre Défossez](https://ai.honu.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/audiocraft?style=social)](https://github.com/facebookresearch/audiocraft) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.05284), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.11325)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/encodec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/camenduru/MusicGen-colab)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/musicgen-large)</li><li>[project](https://ai.honu.io/papers/musicgen/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/v-YpvPkhdO4), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=EGfxuTy9Eeo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/la2fGS0dW98)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1fxGqfg96RBUvGxZ1XXN07s3DthrKUl4-) | 11.06.2023 |
    • Yuxin Wu - detectron2-a-pytorch-based-modular-object-detection-library-/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://detectron2.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/matterport/Mask_RCNN/tree/master/samples/balloon)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) | 26.05.2023 |
    • Nikita Martynov - kozlova)</li> <li>[Katerina Kolomeytseva](https://www.linkedin.com/in/katerina-kolomeytseva-394a7a21a)</li><details><summary>others</summary><li>[Aleksandr Abramov](https://github.com/Ab1992ao)</li> <li>[Alena Fenogenova](https://github.com/Alenush)</li></ul></details> | [![](https://img.shields.io/github/stars/ai-forever/sage?style=social)](https://github.com/ai-forever/sage) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2308.09435)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ai-forever/augmentex)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/RuM2M100-1.2B), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/FRED-T5-large-spell), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/RuM2M100-418M), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/ai-forever/T5-large-spell), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/ai-forever/spellcheck_benchmark)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Levenshtein_distance)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yFfkV0Qjuu0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/sage/blob/main/notebooks/text_correction_demo.ipynb) | 11.04.2024 |
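The sage entry links Levenshtein distance, the standard metric for scoring spelling corrections. A compact dynamic-programming implementation (a generic sketch, not sage's own evaluation code):

```python
# Levenshtein edit distance via the classic two-row DP table.

def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character insertions, deletions and
    substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(
                prev[j] + 1,               # deletion
                cur[j - 1] + 1,            # insertion
                prev[j - 1] + (ca != cb),  # substitution (free on match)
            ))
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # 3
```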
    • Tincans - ai/gazelle?style=social)](https://github.com/tincans-ai/gazelle) <ul><li>[blog post](https://tincans.ai/slm)</li><li>[demo](https://demo.tincans.ai/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/qyC5h3FSzU)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://www.reddit.com/r/LocalLLaMA/comments/1cr84gb/joint_speechlanguage_model_respond_directly_to/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/tincans-ai/gazelle-v0.1), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/tincans-ai/gazelle-v0.2), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/tincans-ai/gazelle-v0.2-dpo), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/wav2vec2-base-960h), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/meta-llama/Llama-2-7b-chat)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Spike_(software_development))</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tincans-ai/gazelle/blob/master/examples/infer_quantized.ipynb) | 20.03.2024 |
    • Jason Liu
    • Jake Vanderplas
    • Guangyao Zhou - dedieu)</li> <li>[Miguel Lázaro-Gredilla](https://www.tsc.uc3m.es/~miguel/)</li><details><summary>others</summary><li>[Shrinu Kushagra](https://cs.uwaterloo.ca/~skushagr/)</li> <li>[Dileep George](https://dileeplearning.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/PGMax?style=social)](https://github.com/deepmind/PGMax) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.04110)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Belief_propagation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/PGMax/blob/main/examples/rcn.ipynb) | 05.05.2023 |
    • Dongxu Li - wang)</li><details><summary>others</summary><li>[Silvio Savarese](https://scholar.google.com/citations?user=ImpbxLsAAAAJ)</li> <li>[Steven Hoi](https://sites.google.com/view/stevenhoi)</li></ul></details> | [![](https://img.shields.io/github/stars/salesforce/LAVIS?style=social)](https://github.com/salesforce/LAVIS) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.09019), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.06500), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.12597), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.10846), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2210.08773)</li><li>[blog post](https://blog.salesforceairesearch.com/lavis-language-vision-library/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://opensource.salesforce.com/LAVIS//latest/index.html)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Merlion)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/salesforce/LAVIS/blob/main/projects/img2llm-vqa/img2llm_vqa.ipynb) | 24.03.2023 |
    • Michael Broughton - o9AhIz1uvo)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/quantum/blob/master/docs/tutorials/hello_many_worlds.ipynb) | 17.05.2024 |
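Conceptually, the "hello many worlds" tutorial above starts by putting a qubit into superposition with a Hadamard gate. A plain-Python sketch of that single step (TensorFlow Quantum itself builds Cirq circuits; this is only the underlying linear algebra):

```python
# A Hadamard gate applied to |0> yields an equal superposition:
# measuring the qubit then gives 0 or 1 with probability 1/2 each.
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]    # Hadamard gate
ket0 = [1.0, 0.0]        # qubit initialized to |0>

psi = apply_gate(H, ket0)
probs = [abs(a) ** 2 for a in psi]
print(probs)  # ~[0.5, 0.5]
```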
    • Jiawei Liu - ng)</li> <li>[Yinlin Deng](https://dengyinlin.github.io/)</li> <li>[Lingming Zhang](http://lingming.cs.illinois.edu/)</li></ul> | [![](https://img.shields.io/github/stars/ise-uiuc/tzer?style=social)](https://github.com/ise-uiuc/tzer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.09947)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/repository/docker/tzerbot/oopsla)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://tzer.readthedocs.io/en/latest/index.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/ganler/memcov)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ise-uiuc/tzer/blob/main/bug-report.ipynb) | 09.03.2023 |
    • Tom Hennigan - haiku?style=social)](https://github.com/deepmind/dm-haiku) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://dm-haiku.readthedocs.io/en/latest/)</li><li>[website](https://www.haiku-os.org/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/dm-haiku/blob/main/examples/haiku_lstms.ipynb) | 02.03.2023 |
    • Yuan Tang - _EId-D0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3bownM3L5zM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/tensorboard/blob/master/docs/scalars_and_keras.ipynb) | 10.02.2023 |
    • Jason Roselander - engine/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/federated-learning)</li><li>[shell](https://cloud.google.com/shell/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/high_performance_simulation_with_kubernetes.ipynb) | 31.01.2023 |
    • Oleksii Kuchaiev
    • Nathan Raw - diffusion-videos?style=social)](https://github.com/nateraw/stable-diffusion-videos) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://gist.github.com/karpathy/00103b0037c5aaea32fe1da1af553355), [<img src="images/git.svg" alt="git" height=20/>](https://gist.github.com/nateraw/c989468b74c616ebbc6474aa8cdd9e53)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/nateraw/stable-diffusion-videos/blob/main/stable_diffusion_videos.ipynb) | 07.05.2024 |
    • Craig Macdonald - badge.php?doi=10.1145/3459637.3482013)](https://doi.org/10.1145/3459637.3482013) [![](https://img.shields.io/github/stars/terrier-org/pyterrier?style=social)](https://github.com/terrier-org/pyterrier) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2007.14271)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pyterrier.readthedocs.io)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrier-org/ecir2021tutorial), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_ance), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_colbert), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_pisa), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_t5), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_doc2query), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/terrierteam/pyterrier_deepct)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/terrier-org/pyterrier/blob/master/examples/notebooks/non_en_retrieval.ipynb) | 02.11.2022 |
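PyTerrier pipelines like the one above typically rank documents with BM25. A simplified scorer over hypothetical toy documents (a sketch of the weighting model only, not PyTerrier's Terrier-backed implementation):

```python
# Simplified BM25: IDF-weighted term frequency with length normalization.
import math

def bm25_score(query, doc, docs, k1=1.2, b=0.75):
    """Score one tokenized doc against a tokenized query, given the corpus."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N          # average document length
    score = 0.0
    for term in query:
        df = sum(term in d for d in docs)          # document frequency
        idf = math.log(1 + (N - df + 0.5) / (df + 0.5))
        tf = doc.count(term)                       # term frequency in this doc
        score += idf * tf * (k1 + 1) / (
            tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

docs = [["terrier", "retrieval"], ["colab", "notebook"],
        ["terrier", "terrier", "index"]]
ranked = sorted(docs, key=lambda d: bm25_score(["terrier"], d, docs),
                reverse=True)
print(ranked[0])  # the doc mentioning "terrier" most often ranks first
```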
    • Alexander Kapitanov - theory?style=social)](https://github.com/hukenovs/dsp-theory) <ul><li>[blog post](https://habr.com/ru/articles/460445/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/hukenovs/dsp-theory/blob/master/src/dsp_theory_1_signals.ipynb) | 18.10.2022 |
    • Eugene Yurtsev
    • Rishabh Agarwal - research/batch_rl?style=social)](https://github.com/google-research/batch_rl) <ul><li>[DQN](https://www.nature.com/articles/nature14236?wm=book_wap_0005)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.04543), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1709.06009)</li><li>[blog post](https://ai.googleblog.com/2020/04/an-optimistic-perspective-on-offline.html)</li><li>[data](https://console.cloud.google.com/storage/browser/atari-replay-datasets), [data](https://research.google/resources/datasets/dqn-replay/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/atari-py/tree/0.2.5/atari_py/atari_roms), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mgbellemare/Arcade-Learning-Environment), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mila-iqia/SGI/blob/master/src/offline_dataset.py), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/kzl/decision-transformer/tree/master/atari)</li><li>[project](https://offline-rl.github.io/)</li><li>[slides](https://docs.google.com/presentation/d/1ROltXr6FIeYKrnGl0tKHGWI0pL4Zo8CnvAK2-cdpQyY)</li><li>[talk](https://slideslive.com/38928373/an-optimistic-perspective-on-offline-deep-reinforcement-learning)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/install/install_linux)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1ktlNni_vwFpFtCgUez-RHW0OdGc2U_Wv) | 04.10.2022 |
    • Matt Hoffman - Maron](https://github.com/fastturtle)</li><details><summary>others</summary><li>[Feryal Behbahani](https://feryal.github.io/)</li> <li>[Tamara Norman](https://github.com/tamaranorman)</li> <li>[Abbas Abdolmaleki](https://scholar.google.com/citations?user=cCYTVWQAAAAJ)</li> <li>[Albin Cassirer](https://github.com/acassirer)</li> <li>[Fan Yang](https://github.com/ddmbr)</li> <li>[Kate Baumli](https://github.com/katebaumli)</li> <li>[Sarah Henderson](https://www.linkedin.com/in/sarah-henderson-agilecoach/)</li> <li>[Alex Novikov](https://scholar.google.ru/citations?user=jMUkLqwAAAAJ)</li> <li>[Sergio Gómez Colmenarejo](https://scholar.google.ru/citations?user=0Dkf68EAAAAJ)</li> <li>[Serkan Cabi](https://scholar.google.ru/citations?&user=l-HhJaUAAAAJ)</li> <li>[Caglar Gulcehre](https://www.caglarg.com/)</li> <li>[Tom Le Paine](http://tomlepaine.github.io/)</li> <li>[Andrew Cowie](https://scholar.google.ru/citations?&user=aTvi5mUAAAAJ)</li> <li>[Ziyu Wang](https://ziyuw.github.io/)</li> <li>[Bilal Piot](https://scholar.google.ru/citations?&user=fqxNUREAAAAJ)</li> <li>[Nando de Freitas](https://github.com/nandodf)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/acme?style=social)](https://github.com/deepmind/acme) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.00979)</li><li>[blog post](https://www.deepmind.com/publications/acme-a-new-framework-for-distributed-reinforcement-learning)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://dm-acme.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/dm_env)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NUwDr42bPOw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J1XCWjuyRaI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/pFMuQWpHI5k)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/acme/blob/master/examples/tutorial.ipynb) | 26.09.2022 |
    • Vighnesh Birodkar - mac)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/models/blob/master/research/object_detection/colab_tutorials/deepmac_colab.ipynb) | 09.08.2022 |
    • Aadesh Gupta - r)</li> <li>[Ashish Shrivastava](https://github.com/ashish3586)</li> <li>[Nagender Aneja](https://researchid.co/naneja)</li> <li>[Zijie Wang](https://zijie.wang/)</li> <li>[Yiwen Shi](https://github.com/Yiwen-Shi)</li> <li>[Afnan Mir](https://github.com/afnanmmir)</li> <li>[William Soto](https://github.com/sotwi)</li> <li>[Chandan Singh](https://csinva.io/)</li> <li>[Claude Roux](https://github.com/ClaudeRoux)</li> <li>[Abinaya Mahendiran](https://github.com/AbinayaM02)</li> <li>[Anna Shvets](https://github.com/asnota)</li> <li>[Kaustubh Dhole](https://github.com/kaustubhdhole)</li> <li>[Bryan Wilie](https://github.com/bryanwilie)</li> <li>[Jamie Simon](https://james-simon.github.io/)</li> <li>[Mukund Varma](https://github.com/MukundVarmaT)</li> <li>[Sang Han](https://github.com/jjangsangy)</li> <li>[Denis Kleyko](https://github.com/denkle)</li> <li>[Samuel Cahyawijaya](https://github.com/SamuelCahyawijaya)</li> <li>[Filip Cornell](https://github.com/Filco306)</li> <li>[Tanay Dixit](https://tanay2001.github.io/)</li> <li>[Connor Boyle](https://github.com/boyleconnor)</li> <li>[Genta Indra Winata](https://gentawinata.com/)</li> <li>[Seungjae Ryan Lee](https://github.com/seungjaeryanlee)</li> <li>[Marcin Namysl](https://github.com/mnamysl)</li> <li>[Roman Sitelew](https://github.com/RomanPlusPlus)</li> <li>[Zhenhao Li](https://zhenhaoli.net/)</li> <li>[Fiona Tan](https://tanfiona.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/GEM-benchmark/NL-Augmenter?style=social)](https://github.com/GEM-benchmark/NL-Augmenter) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.02721)</li><li>[website](https://gem-benchmark.com/nl_augmenter)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/GEM-benchmark/NL-Augmenter/blob/main/notebooks/Write_a_sample_transformation.ipynb) | 06.08.2022 |
    • Jacob Solawetz - to-train-yolov5-on-a-custom-dataset/)</li><li>[data](https://public.roboflow.ai/object-detection/bccd)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1gDZ2xcTOgR39tGGs-EZ6i3RTs16wmzZQ) | 20.07.2022 |
    • multimodal.art - diffusion)</li><li>[project](https://multimodal.art/mindseye)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1cg0LZ5OfN9LAIB37Xq49as0fSJxcKtC5) | 06.07.2022 |
    • Daniil Chesakov - kuznetsov-70ab12127)</li> <li>[Denis Dimitrov](https://github.com/denndimitrov)</li></ul> | [![](https://img.shields.io/github/stars/ai-forever/sber-swap?style=social)](https://github.com/ai-forever/sber-swap) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.03046), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1912.13457), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1901.08971), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.06340), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05005), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.09965)</li><li>[blog post](https://habr.com/ru/company/sberbank/blog/645919/)</li><li>[data](https://www.robots.ox.ac.uk/~vgg/data/vgg_face/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/14wnxMvD9zsiBQo2FtTpxn6w2cpXCcb-7) | 29.06.2022 |
    • Jaehoon Lee - Dickstein](http://www.sohldickstein.com/)</li> <li>[Vinay Ramasesh](https://ramasesh.github.io/)</li> <li>[Sajant Anand](https://github.com/sajantanand)</li><details><summary>others</summary><li>[Alicia Parrish](https://aliciaparrish.com/)</li> <li>[Ethan Dyer](https://github.com/ethansdyer)</li> <li>[Liam Dugan](http://liamdugan.com/)</li> <li>[Dieuwke Hupkes](https://github.com/dieuwkehupkes)</li> <li>[Daniel Freeman](https://github.com/cdfreeman-google)</li> <li>[Guy Gur-Ari](https://github.com/guygurari)</li> <li>[Aitor Lewkowycz](https://github.com/lewkowycz)</li></ul></details> | [![](https://img.shields.io/github/stars/google/BIG-bench?style=social)](https://github.com/google/BIG-bench) <ul><li>[API](https://google.github.io/BIG-bench/docs/html/bigbench/index.html)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2206.04615)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/seqio)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/BIG-bench/blob/master/notebooks/colab_examples.ipynb) | 27.06.2022 |
    • Aleksey Korshuk - demo.ipynb) | 25.06.2022 |
    • Balint Pato
    • Jacob Kahn - gAAAAJ)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Ronan Collobert](https://ronan.collobert.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/flashlight/flashlight?style=social)](https://github.com/flashlight/flashlight) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.12465)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/flml/flashlight/tags?page=1&ordering=last_updated&name=cuda-latest)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://fl.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/arrayfire/arrayfire), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/vcpkg), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/arrayfire/arrayfire-ml/), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nvidia/cub), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/USCiLab/cereal), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nothings/stb), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookincubator/gloo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/oneapi-src/oneDNN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/glog), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/gflags/gflags), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/flashlight/text)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/flashlight/flashlight/blob/master/flashlight/app/asr/tutorial/notebooks/FinetuneCTC.ipynb) | 01.06.2022 |
    • Kevin Frans - exploring-text-to-drawing-synthesis/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/BachiLi/diffvg/blob/master/apps/painterly_rendering.py)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/kvfrans/clipdraw/blob/main/clipdraw.ipynb) | 28.04.2022 |
    • Wilson Yan
    • Xingchao Liu - Hi8STUrLc2m4DeOviv7NO) | 02.01.2022 |
    • Alex Shonenkov - forever/ru-dalle?style=social)](https://github.com/ai-forever/ru-dalle) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/bes-dev/vqvae_dwt_distiller.pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/boomb0om/Real-ESRGAN-colab)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/multimodalart/rudalle)</li><li>[project](https://rudalle.ru/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/ru-dalle/blob/master/jupyters/ruDALLE-example-generation-A100.ipynb) | 03.11.2021 |
    • EleutherAI - neo?style=social)](https://github.com/EleutherAI/gpt-neo) <ul><li>[GPT-2](https://openai.com/blog/better-language-models/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.14165), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.05150), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1701.06538)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/tensorflow/mesh), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/EleutherAI/gpt-neox/)</li><li>[pretrained](https://the-eye.eu/public/AI/gptneo-release/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/EleutherAI/GPTNeo/blob/master/GPTNeo_example_notebook.ipynb) | 28.03.2021 |
    • Billy Lamberta - Or](https://danielcohenor.com/)</li> <li>[Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)</li> <li>[Adam Roberts](https://github.com/adarob)</li> <li>[Jesse Engel](https://github.com/jesseengel)</li> <li>[Google](https://www.tensorflow.org/)</li> <li>[Curtis Hawthorne](https://github.com/cghawthorne)</li> <li>[Eli Shechtman](https://research.adobe.com/person/eli-shechtman/)</li> <li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li> <li>[Yuval Alaluf](https://yuval-alaluf.github.io/)</li> <li>[Patrick Esser](https://github.com/pesser)</li> <li>[Robin Rombach](https://github.com/rromb)</li> <li>[Or Patashnik](https://orpatashnik.github.io/)</li> <li>[Antonio Torralba](https://groups.csail.mit.edu/vision/torralbalab/)</li> <li>[Bolei Zhou](https://boleizhou.github.io/)</li> <li>[Krzysztof Ostrowski](https://github.com/krzys-ostrowski)</li> <li>[Max Woolf](https://minimaxir.com/)</li> <li>[Jon Barron](https://jonbarron.info/)</li> <li>[Xiaohua Zhai](https://github.com/xiaohuazhai)</li> <li>[Ishan Misra](https://imisra.github.io/)</li> <li>[Nikhila Ravi](https://nikhilaravi.com/)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li> <li>[Gabriel Synnaeve](https://syhw.github.io/)</li> <li>[Jia-Bin Huang](https://jbhuang0604.github.io/)</li> <li>[Karen Simonyan](https://scholar.google.com/citations?user=L7lMQkQAAAAJ)</li> <li>[Amit Bermano](https://www.cs.tau.ac.il/~amberman/)</li> <li>[Lei Zhang](https://www.leizhang.org/)</li> <li>[Jun-Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li> <li>[Michael Black](https://ps.is.mpg.de/~black)</li> <li>[Yong Zhang](https://yzhang2016.github.io/)</li> <li>[Phil Wang](https://lucidrains.github.io/)</li> <li>[Ben Trevett](https://bentrevett.com/)</li></ul> | <ul><li>tensorflow/models [![](https://img.shields.io/github/stars/tensorflow/models?style=social)](https://github.com/tensorflow/models)</li> <li>CompVis/stable-diffusion [![](https://img.shields.io/github/stars/CompVis/stable-diffusion?style=social)](https://github.com/CompVis/stable-diffusion)</li> <li>openai/whisper [![](https://img.shields.io/github/stars/openai/whisper?style=social)](https://github.com/openai/whisper)</li> <li>CorentinJ/Real-Time-Voice-Cloning [![](https://img.shields.io/github/stars/CorentinJ/Real-Time-Voice-Cloning?style=social)](https://github.com/CorentinJ/Real-Time-Voice-Cloning)</li> <li>KillianLucas/open-interpreter [![](https://img.shields.io/github/stars/KillianLucas/open-interpreter?style=social)](https://github.com/KillianLucas/open-interpreter)</li> <li>ultralytics/yolov5 [![](https://img.shields.io/github/stars/ultralytics/yolov5?style=social)](https://github.com/ultralytics/yolov5)</li> <li>iperov/DeepFaceLab [![](https://img.shields.io/github/stars/iperov/DeepFaceLab?style=social)](https://github.com/iperov/DeepFaceLab)</li> <li>facebookresearch/segment-anything [![](https://img.shields.io/github/stars/facebookresearch/segment-anything?style=social)](https://github.com/facebookresearch/segment-anything)</li> <li>jakevdp/PythonDataScienceHandbook [![](https://img.shields.io/github/stars/jakevdp/PythonDataScienceHandbook?style=social)](https://github.com/jakevdp/PythonDataScienceHandbook)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>lllyasviel/Fooocus [![](https://img.shields.io/github/stars/lllyasviel/Fooocus?style=social)](https://github.com/lllyasviel/Fooocus)</li> <li>Stability-AI/stablediffusion [![](https://img.shields.io/github/stars/Stability-AI/stablediffusion?style=social)](https://github.com/Stability-AI/stablediffusion)</li> <li>LAION-AI/Open-Assistant [![](https://img.shields.io/github/stars/LAION-AI/Open-Assistant?style=social)](https://github.com/LAION-AI/Open-Assistant)</li> <li>XingangPan/DragGAN [![](https://img.shields.io/github/stars/XingangPan/DragGAN?style=social)](https://github.com/XingangPan/DragGAN)</li> <li>TencentARC/GFPGAN [![](https://img.shields.io/github/stars/TencentARC/GFPGAN?style=social)](https://github.com/TencentARC/GFPGAN)</li> <li>microsoft/visual-chatgpt [![](https://img.shields.io/github/stars/microsoft/visual-chatgpt?style=social)](https://github.com/microsoft/visual-chatgpt)</li> <li>suno-ai/bark [![](https://img.shields.io/github/stars/suno-ai/bark?style=social)](https://github.com/suno-ai/bark)</li> <li>google-research/google-research [![](https://img.shields.io/github/stars/google-research/google-research?style=social)](https://github.com/google-research/google-research)</li> <li>ray-project/ray [![](https://img.shields.io/github/stars/ray-project/ray?style=social)](https://github.com/ray-project/ray)</li> <li>coqui-ai/TTS [![](https://img.shields.io/github/stars/coqui-ai/TTS?style=social)](https://github.com/coqui-ai/TTS)</li> <li>facebookresearch/fairseq [![](https://img.shields.io/github/stars/facebookresearch/fairseq?style=social)](https://github.com/facebookresearch/fairseq)</li> <li>facebookresearch/detectron2 [![](https://img.shields.io/github/stars/facebookresearch/detectron2?style=social)](https://github.com/facebookresearch/detectron2)</li> <li>google/jax [![](https://img.shields.io/github/stars/google/jax?style=social)](https://github.com/google/jax)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>xinntao/Real-ESRGAN [![](https://img.shields.io/github/stars/xinntao/Real-ESRGAN?style=social)](https://github.com/xinntao/Real-ESRGAN)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>deezer/spleeter 
[![](https://img.shields.io/github/stars/deezer/spleeter?style=social)](https://github.com/deezer/spleeter)</li> <li>svc-develop-team/so-vits-svc [![](https://img.shields.io/github/stars/svc-develop-team/so-vits-svc?style=social)](https://github.com/svc-develop-team/so-vits-svc)</li> <li>huggingface/diffusers [![](https://img.shields.io/github/stars/huggingface/diffusers?style=social)](https://github.com/huggingface/diffusers)</li> <li>openai/CLIP [![](https://img.shields.io/github/stars/openai/CLIP?style=social)](https://github.com/openai/CLIP)</li> <li>openai/gpt-2 [![](https://img.shields.io/github/stars/openai/gpt-2?style=social)](https://github.com/openai/gpt-2)</li> <li>AlexeyAB/darknet [![](https://img.shields.io/github/stars/AlexeyAB/darknet?style=social)](https://github.com/AlexeyAB/darknet)</li> <li>RVC-Project/Retrieval-based-Voice-Conversion-WebUI [![](https://img.shields.io/github/stars/RVC-Project/Retrieval-based-Voice-Conversion-WebUI?style=social)](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI)</li> <li>pyg-team/pytorch_geometric [![](https://img.shields.io/github/stars/pyg-team/pytorch_geometric?style=social)](https://github.com/pyg-team/pytorch_geometric)</li> <li>jina-ai/jina [![](https://img.shields.io/github/stars/jina-ai/jina?style=social)](https://github.com/jina-ai/jina)</li></ul> | <ul><li>AlphaFold [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03819-2)](https://doi.org/10.1038/s41586-021-03819-2)</li> <li>MoCo [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00975)](https://doi.org/10.1109/CVPR42600.2020.00975)</li> <li>EfficientDet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.01079)](https://doi.org/10.1109/CVPR42600.2020.01079)</li> <li>DeepLabCut [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41593-018-0209-y)](https://doi.org/10.1038/s41593-018-0209-y)</li> <li>StyleGAN 2 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00813)](https://doi.org/10.1109/CVPR42600.2020.00813)</li> <li>Fine-tuning a BERT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/N19-1423)](https://doi.org/10.18653/v1/N19-1423)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>ConvNeXt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)</li> <li>SwinIR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCVW54120.2021.00210)](https://doi.org/10.1109/ICCVW54120.2021.00210)</li> <li>HMR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR.2018.00744)](https://doi.org/10.1109/CVPR.2018.00744)</li> <li>Instant-NGP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530127)](https://doi.org/10.1145/3528223.3530127)</li> <li>Neural Style Transfer [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1167/16.12.326)](https://doi.org/10.1167/16.12.326)</li> <li>Taming Transformers for High-Resolution Image Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268)</li> <li>PIFu [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV.2019.00239)](https://doi.org/10.1109/ICCV.2019.00239)</li> <li>SPIN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV.2019.00234)](https://doi.org/10.1109/ICCV.2019.00234)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> <li>Pixel2Style2Pixel [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00232)](https://doi.org/10.1109/CVPR46437.2021.00232)</li> <li>VIBE 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00530)](https://doi.org/10.1109/CVPR42600.2020.00530)</li> <li>InterFaceGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00926)](https://doi.org/10.1109/CVPR42600.2020.00926)</li> <li>ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)</li> <li>Real-ESRGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCVW54120.2021.00217)](https://doi.org/10.1109/ICCVW54120.2021.00217)</li> <li>PIFuHD [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00016)](https://doi.org/10.1109/CVPR42600.2020.00016)</li> <li>Nerfies [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00581)](https://doi.org/10.1109/ICCV48922.2021.00581)</li> <li>Skillful Precipitation Nowcasting Using Deep Generative Models of Radar [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03854-z)](https://doi.org/10.1038/s41586-021-03854-z)</li> <li>encoder4editing [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450626.3459838)](https://doi.org/10.1145/3450626.3459838)</li> <li>BiT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58558-7_29)](https://doi.org/10.1007/978-3-030-58558-7_29)</li> <li>Parallel WaveGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICASSP40776.2020.9053795)](https://doi.org/10.1109/ICASSP40776.2020.9053795)</li> <li>Wav2Lip [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3394171.3413532)](https://doi.org/10.1145/3394171.3413532)</li> <li>LaMa [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/WACV51458.2022.00323)](https://doi.org/10.1109/WACV51458.2022.00323)</li> <li>SeFa [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00158)](https://doi.org/10.1109/CVPR46437.2021.00158)</li> 
<li>Cleanlab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1613/jair.1.12125)](https://doi.org/10.1613/jair.1.12125)</li> <li>CartoonGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR.2018.00986)](https://doi.org/10.1109/CVPR.2018.00986)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>GFPGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00905)](https://doi.org/10.1109/CVPR46437.2021.00905)</li> <li>StyleGAN-NADA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530164)](https://doi.org/10.1145/3528223.3530164)</li></ul> |
    • Nils Reimers <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/Helsinki-NLP/Opus-MT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairseq/tree/main/examples/multilingual)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1X47vgSiOphpxS5w_LPtjQgJmiSTNfRNC) | 26.04.2021 |
    • Stability AI | [![](https://img.shields.io/github/stars/Stability-AI/StableLM?style=social)](https://github.com/Stability-AI/StableLM) <ul><li>[blog post](https://stability.ai/blog/stability-ai-launches-the-first-of-its-stablelm-suite-of-language-models)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/llama), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/tatsu-lab/stanford_alpaca), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nomic-ai/gpt4all), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/databrickslabs/dolly), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/anthropics/hh-rlhf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/llama.cpp)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/lmsys/vicuna-13b-delta-v0), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/RyokoAI/ShareGPT52K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dypPSs4t77g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nWf1StvtoRw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Hg-s2RTaTFE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qXtJjoEfTnA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Stability-AI/StableLM/blob/main/notebooks/stablelm-alpha.ipynb) | 27.04.2023 |
    • Edgar Riba | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/WACV45572.2020.9093363)](https://doi.org/10.1109/WACV45572.2020.9093363) [![](https://img.shields.io/github/stars/kornia/kornia?style=social)](https://github.com/kornia/kornia) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.02190)</li><li>[blog post](https://opencv.org/kornia-an-open-source-differentiable-computer-vision-library-for-pytorch/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://kornia.readthedocs.io/en/latest/)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://join.slack.com/t/kornia/shared_invite/zt-csobk21g-2AQRi~X9Uu6PLMuUZdvfjA)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/kornia_foss)</li><li>[website](https://kornia.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCI1SE1Ij2Fast5BSKxoa7Ag), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3RmCYFhwclE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/AAZa-mXjYF0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/kornia/kornia/blob/master/examples/augmentation/kornia_augmentation.ipynb) | 02.07.2024 |
    • w-okada | [![](https://img.shields.io/github/stars/w-okada/voice-changer?style=social)](https://github.com/w-okada/voice-changer) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/yxlllc/DDSP-SVC)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/wok000/vcclient000)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/POo_Cg0eFMU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/fba9Zhsukqw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/s_GirFEGvaA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Q7bbEC4aeKM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_JXbvSTGPoo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/pHhjg2JwdPI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/We5oYpCR3WQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/aVfoC1EHlVs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YF1lBaqeyt8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/hinabl/voice-changer-colab/blob/master/Hina_Modified_Realtime_Voice_Changer_on_Colab.ipynb) | 01.07.2024 |
    • EnzymeZoo | [![](https://img.shields.io/github/stars/deforum-art/deforum-stable-diffusion?style=social)](https://github.com/deforum-art/deforum-stable-diffusion) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/deforum)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.google.com/document/d/1RrQv7FntzOuLg4ohjRZPVL7iptIyBhwwbcEYEW2OfcI)</li><li>[project](https://deforum.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w_sxuDMt_V0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bicPayZDI60), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dqkQo2alZvU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deforum-art/deforum-stable-diffusion/blob/main/Deforum_Stable_Diffusion.ipynb) | 29.06.2024 |
    • Zachary Charles <ul><li>[blog post](https://ai.googleblog.com/2020/05/federated-analytics-collaborative-data.html)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/federated-learning)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/api_docs/python/tff/learning/Model)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/building_your_own_federated_learning_algorithm.ipynb) | 28.06.2024 |
    • CAMB.AI | [![](https://img.shields.io/github/stars/Camb-ai/MARS5-TTS?style=social)](https://github.com/Camb-ai/MARS5-TTS) <ul><li>[demo](https://6b1a3a8e53ae.ngrok.app/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/FFQNCSKSXX)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/cambai/mars5ttsimage)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.camb.ai/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/RF5/transfusion-asr), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ehoogeboom/multinomial_diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/karpathy/minbpe)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/CAMB-AI/MARS5-TTS)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bmJSLPYrKtE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Camb-ai/mars5-tts/blob/master/mars5_demo.ipynb) | 25.06.2024 |
    • Thomas Simonini | [![](https://img.shields.io/github/stars/huggingface/deep-rl-class?style=social)](https://github.com/huggingface/deep-rl-class) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/alex-petrenko/sample-factory)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/deep-rl-course/unit0/introduction), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/huggingface-projects/Deep-Reinforcement-Learning-Leaderboard)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html)</li><li>[syllabus](https://simoninithomas.github.io/deep-rl-course)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2GwBez0D20A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/CsuIANBnSq8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/AQKAOXJa6qg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/deep-rl-class/blob/main/notebooks/unit1/unit1.ipynb) | 24.06.2024 |
    • Artiprocher | [![](https://img.shields.io/github/stars/Artiprocher/DiffSynth-Studio?style=social)](https://github.com/Artiprocher/DiffSynth-Studio) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2401.16224)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Helsinki-NLP/opus-mt-en-zh), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/alibaba-pai/pai-bloom-1b1-text2prompt-sd)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Artiprocher/DiffSynth-Studio/blob/main/examples/Diffutoon.ipynb) | 06.06.2024 |
    • Saran Tunyasuvunakool - tom)</li> <li>[Timothy Lillicrap](https://contrastiveconvergence.net/~timothylillicrap/index.php)</li> <li>[Nicolas Heess](https://scholar.google.com/citations?user=79k7bGEAAAAJ)</li> <li>[Yuval Tassa](https://github.com/yuvaltassa)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/dm_control?style=social)](https://github.com/deepmind/dm_control) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.12983), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.00690), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1902.07151), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1707.02286), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.09564), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.10567)</li><li>[blog post](https://www.deepmind.com/publications/dm-control-software-and-tasks-for-continuous-control)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Tippe_top)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/CMjoiU482Jk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rAai4QzcYbs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/WhaRsrlaXLk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/dm_control/blob/master/tutorial.ipynb) | 04.06.2024 |
    • Emo Todorov - tom)</li> <li>[Yuval Tassa](https://github.com/yuvaltassa)</li></ul> | [![](https://img.shields.io/github/stars/deepmind/mujoco?style=social)](https://github.com/deepmind/mujoco) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.12983)</li><li>[blog post](https://www.deepmind.com/blog/opening-up-a-physics-simulator-for-robotics), [blog post](https://www.deepmind.com/blog/open-sourcing-mujoco)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mujoco.readthedocs.io/en/latest/overview.html)</li><li>[website](https://mujoco.org/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Tippe_top), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Chaos_theory), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/3D_projection#Mathematical_formula)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0ORsj_E17B0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yHZVVfsJ8mc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/eyzzsGJ1iic)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/dm_control/blob/master/dm_control/mujoco/tutorial.ipynb) | 04.06.2024 |
    • Nishant Aklecha | [![](https://img.shields.io/github/stars/naklecha/llama3-from-scratch?style=social)](https://github.com/naklecha/llama3-from-scratch) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/karpathy/minbpe)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/naklecha), [<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/aaaaaaaaaaorg)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/o29P0Kpobz0?t=530)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/naklecha/llama3-from-scratch/blob/main/llama3-from-scratch.ipynb) | 19.05.2024 |
    • Oleksandr Ferludin <ul><li>[Alvaro Sanchez-Gonzalez](https://github.com/alvarosg)</li><details><summary>others</summary> <li>[Wai Lok Sibon Li](https://scholar.google.com/citations?user=qX9aUx8AAAAJ)</li> <li>[Sami Abu-El-Haija](https://samihaija.github.io/)</li> <li>[Peter Battaglia](https://scholar.google.com/citations?user=nQ7Ij30AAAAJ)</li> <li>[Neslihan Bulut](https://scholar.google.com/citations?user=k_cadGsAAAAJ)</li> <li>[Jonathan Halcrow](https://scholar.google.com/citations?user=2zZucy4AAAAJ)</li> <li>[Filipe Miguel Gonçalves de Almeida](https://github.com/fmgda)</li> <li>[Pedro Gonnet](https://research.google/people/pedro-gonnet/)</li> <li>[Liangze Jiang](https://liangzejiang.github.io/)</li> <li>[Parth Kothari](https://thedebugger811.github.io/)</li> <li>[Silvio Lattanzi](https://sites.google.com/site/silviolattanzi/)</li> <li>[André Linhares](https://scholar.google.com/citations?user=YYRnhTkAAAAJ)</li> <li>[Brandon Mayer](https://github.com/brandonmayer-zz)</li> <li>[Vahab Mirrokni](https://people.csail.mit.edu/mirrokni/Welcome.html)</li> <li>[John Palowitch](http://ml.johnpalowitch.com/)</li> <li>[Mihir Paradkar](https://www.linkedin.com/in/mihir-paradkar-22b88579)</li> <li>[Jennifer She](https://scholar.google.com/citations?user=Gjf_sd0AAAAJ)</li> <li>[Anton Tsitsulin](https://tsitsul.in/)</li> <li>[Kevin Villela](https://www.linkedin.com/in/kevin-villela-612a6443)</li> <li>[Lisa Wang](https://scholar.google.com/citations?user=5KmYPkIAAAAJ)</li> <li>[Bryan Perozzi](http://www.perozzi.net/)</li></ul></details> | [![](https://img.shields.io/github/stars/tensorflow/gnn?style=social)](https://github.com/tensorflow/gnn) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2207.03522)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/fidels/introduction-to-tf-gnn)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@techtes.com/getting-started-with-tf-gnn-with-python-26d8e341db05)</li><li>[<img 
src="images/tf.svg" alt="tf" height=20/>](https://blog.tensorflow.org/2024/02/graph-neural-networks-in-tensorflow.html), [<img src="images/tf.svg" alt="tf" height=20/>](https://blog.tensorflow.org/2021/11/introducing-tensorflow-gnn.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL2PZTwLd0HMJC1fU_NkwwpRkcjoGqAECX), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/JqWROPYeqjA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YdGN-J322y4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VDzrvhgyxsU), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/e6WHg1l7AMs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/a75Q6dtg1_s)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/gnn/blob/master/examples/notebooks/graph_network_shortest_path.ipynb) | 24.04.2024 |
    • Shenghai Yuan <ul><li>[Yujun Shi](https://yujun-shi.github.io/)</li> <li>[Yongqi Xu](https://cheliosoops.github.io/YongqiXu.io/)</li><details><summary>others</summary><li>[Ruijie Zhu](https://ruijie-zhu.github.io/)</li> <li>[Bin Lin](https://github.com/LinB203)</li> <li>[Xinhua Cheng](https://cxh0519.github.io/)</li> <li>[Li Yuan](https://yuanli2333.github.io/)</li> <li>[Jiebo Luo](https://www.cs.rochester.edu/u/jluo/)</li></ul></details> | [![](https://img.shields.io/github/stars/PKU-YuanGroup/MagicTime?style=social)](https://github.com/PKU-YuanGroup/MagicTime) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2404.05014), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2406.18522)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/PKU-YuanGroup/ChronoMagic-Bench), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/kijai/ComfyUI-MagicTimeWrapper), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xuduo35/MakeLongVideo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Vchitect/LaVie), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Vchitect/Latte)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/BestWishYsh/MagicTime?logs=build), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/BestWishYsh/ChronoMagic), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/cerspense/zeroscope_v2_576w)</li><li>[project](https://pku-yuangroup.github.io/MagicTime/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1c1rv7q/magictime_demo_timelapse_video_generation_models/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://x.com/_akhaliq/status/1777538468043792473), [<img src="images/twitter.svg" alt="twitter" 
height=20/>](https://twitter.com/vhjf36495872/status/1777525817087553827?s=61&t=r2HzCsU2AnJKbR8yKSprKw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/MagicTime-jupyter/blob/main/MagicTime_jupyter.ipynb) | 14.04.2024 |
    • Elena Samuylova <ul><li>[Emeli Dral](https://github.com/emeli-dral)</li> <li>[Olga Filippova](https://github.com/0lgaF)</li></ul> | [![](https://img.shields.io/github/stars/evidentlyai/evidently?style=social)](https://github.com/evidentlyai/evidently) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.evidentlyai.com/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/0lgaF/my_tab_with_evidently)</li><li>[website](https://evidentlyai.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/c/EvidentlyAI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/L4Pv6ExBQPM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/evidentlyai/evidently/blob/main/examples/sample_notebooks/getting_started_tutorial.ipynb) | 15.03.2024 |
    • Alon Ziv - Diffusion/blob/main/Tutorials/AI-Music-Generation-Audiocraft-Tutorial.md#more-info-about-top-k-top-p-temperature-and-classifier-free-guidance-from-chatgpt)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/magnet-medium-10secs), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/magnet-medium-30secs), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/audio-magnet-medium)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://generativeai.pub/metas-ai-magnet-the-next-big-thing-in-text-to-audio-technology-7d524d9459ef)</li><li>[project](https://pages.cs.huji.ac.il/adiyoss-lab/MAGNeT/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/ArtificialInteligence/comments/19808gf/magnet_masked_audio_generation_using_a_single/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/MAGNeT-colab/blob/main/MAGNET_colab.ipynb) | 16.01.2024 |
    • RVC-Project | [![](https://img.shields.io/github/stars/RVC-Project/Retrieval-based-Voice-Conversion-WebUI?style=social)](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/HcsmBBGyVk)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/auspicious3000/contentvec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/jik876/hifi-gan), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/FFmpeg/FFmpeg), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Anjok07/ultimatevocalremovergui), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvpi/audio-slicer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dream-High/RMVPE)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/lj1995/VoiceConversionWebUI)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@ja.harr91/decoding-the-sound-of-virality-a-deep-dive-into-adversarial-ai-for-voice-conversion-tasks-on-m1-d60d32cfb2d4)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-JcvdDErkAU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9TroP5mR3CM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Y8IxVVQBEpc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qZ12-Vm2ryc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5i_Pyw0gH-M)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/RVC-Project/Retrieval-based-Voice-Conversion-WebUI/blob/main/Retrieval_based_Voice_Conversion_WebUI.ipynb) | 11.01.2024 |
    • Conor Heins <ul><li>[Alec Tschantz](https://github.com/alec-tschantz)</li> <li>[Beren Millidge](https://www.beren.io/)</li> <li>[Brennan Klein](https://github.com/jkbren)</li><details><summary>others</summary><li>[Arun Niranjan](https://github.com/Arun-Niranjan)</li> <li>[Daphne Demekas](https://github.com/daphnedemekas)</li></ul></details> | [![](https://img.shields.io/github/stars/infer-actively/pymdp?style=social)](https://github.com/infer-actively/pymdp) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.03904)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pymdp-rtd.readthedocs.io/en/stable/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/infer-actively/pymdp/blob/master/docs/notebooks/active_inference_from_scratch.ipynb) | 19.03.2023 |
    • Vijish Madhavan | [![](https://img.shields.io/github/stars/vijishmadhavan/Toon-Me?style=social)](https://github.com/vijishmadhavan/Toon-Me) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.10196), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1707.02921), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1603.08155)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/vijishmadhavan/Light-Up/blob/master/Toon_Me_(Try_it_on_Colab).ipynb) | 22.01.2021 |
    • Matthew Tancik - austin)</li> <li>[Kamyar Salahi](https://github.com/TheQuantumFractal)</li> <li>[Abhik Ahuja](https://abhikahuja.com/)</li> <li>[David McAllister](https://github.com/mcallisterdavid)</li> <li>[Angjoo Kanazawa](https://github.com/akanazawa)</li></ul></details> | [![](https://img.shields.io/github/stars/nerfstudio-project/nerfstudio?style=social)](https://github.com/nerfstudio-project/nerfstudio) <ul><li>[Viewer](https://viewer.nerf.studio/)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2302.04264)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/uMbNqcraFc)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.nerf.studio/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/tiny-cuda-nn)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/nerfstudioteam)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/XwKq7qDQCQk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nSFsugarWzk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/h5EWiRRxYEQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8cv9G7izdPY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/nerfstudio-project/nerfstudio/blob/main/colab/demo.ipynb) | 01.03.2024 |
    • MetaVoice - src?style=social)](https://github.com/metavoiceio/metavoice-src) <ul><li>[demo](https://ttsdemo.themetavoice.xyz/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/metavoiceio)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/MetaVoiceAI)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Y_k3bHPcPTo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gVKbf31hrYs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1UmjE1mzfG4td0rCjJEaAWGQXpn_GuwwY) | 26.02.2024 |
    • Lucas Beyer - research/big_vision?style=social)](https://github.com/google-research/big_vision) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2010.11929), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.04560), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.01601), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.01580), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2212.08013), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.13035), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.17376), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.07915), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.16999), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2302.08242), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.07159)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/data), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/big_vision/blob/main/big_vision/configs/proj/image_text/lit.ipynb) | 03.01.2024 |
    • Drengskapur
    • bazanovvanya - forever/music-composer?style=social)](https://github.com/ai-forever/music-composer) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.05858)</li><li>[blog post](https://habr.com/ru/company/sberbank/blog/583592/)</li><li>[data](https://magenta.tensorflow.org/datasets/maestro), [data](https://colinraffel.com//projects/lmd/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/gwinndr/MusicTransformer-Pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/bytedance/GiantMIDI-Piano), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mdeff/fma)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/music-composer/blob/master/src/Music_Composer_Demo_Colab_en.ipynb) | 20.12.2021 |
    • tmoneyx01 - client-py?style=social)](https://github.com/mdai/mdai-client-py) <ul><li>[annotator](https://public.md.ai/annotator/project/PVq9raBJ)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.md.ai/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mdai/ml-lessons/blob/master/lesson1-xray-images-classification.ipynb) | 07.03.2020 |
    • The Mosaic ML Team - best-practices-for-efficient-model-training)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](http://docs.mosaicml.com/)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://join.slack.com/t/mosaicml-community/shared_invite/zt-w0tiddn9-WGTlRpfjcO9J5jyrMub1dg)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/mosaicml)</li><li>[website](https://www.mosaicml.com/composer)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Amdahl's_law)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/@mosaicml6047/videos), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/n-1WV5QdIDc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Xi_5wq2MpOw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mosaicml/composer/blob/dev/examples/getting_started.ipynb) | 01.02.2024 |
    • Edouard Leurent - env?style=social)](https://github.com/eleurent/highway-env) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.03483), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.05701), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2101.07140)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://highway-env.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/eleurent/rl-agents), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/eleurent/finite-mdp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/baselines/tree/master/baselines/her)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/eleurent/highway-env/blob/master/scripts/parking_model_based.ipynb) | 03.01.2024 |
    • Sanchit Gandhi - whisper?style=social)](https://github.com/huggingface/distil-whisper) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2311.00430), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.17192)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/huggingface/safetensors), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dao-AILab/flash-attention)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/collections/distil-whisper/training-datasets-6538d05c69721489d1db1e49), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/auto#transformers.AutoModelForSpeechSeq2Seq), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/model_doc/auto#transformers.AutoProcessor), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/whisper#transformers.WhisperForConditionalGeneration.forward.example), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/main_classes/text_generation#transformers.GenerationMixin.generate.assistant_model), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/perf_infer_gpu_one#flashattention-2), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/transformers/main/en/perf_infer_gpu_one#bettertransformer)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/prompt-engineering/transcribing-audio-with-python-and-distil-whisper-9b4fec3d53bf)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/17vqtcb/p_distilwhisper_a_distilled_variant_of_whisper/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/46Q6fbdUCbg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SZtHEKyvuug), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/kI1pA1CADxM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/sanchit-gandhi/notebooks/blob/main/Distil_Whisper_Benchmark.ipynb) | 08.11.2023 |
    • Shishir Patil - llm/gorilla-cli)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/latinxinai/try-gorilla-a-large-language-model-connected-with-massive-apis-442f3b554ffb)</li><li>[project](http://gorilla.cs.berkeley.edu/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4EdyWkcddPc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RMgM3tPTpXI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/CX1Kzijq2TI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8AqQBPI4CFI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/iQwYoii4YiI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/alDArqcxSvw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/EypdTAlmoo4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LkV5DTRNxAg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1DEBPsccVLF_aUnmD0FwPeHFrtdC0QIUP) | 06.04.2024 |
    • suno - ai/bark?style=social)](https://github.com/suno-ai/bark) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.03143), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2301.02111)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/J2B2vsjKuE)</li><li>[examples](https://suno-ai.notion.site/Bark-Examples-5edae8b02a604b54a42244ba45ebc2e2)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/encodec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/karpathy/nanoGPT)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/huggingface_hub/package_reference/environment_variables#hfhome)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/OnusFM)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/84LzaXAo6vE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rU5Do9yHbwM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w41-MUfxIWo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/_m-MxEpHUQY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dWWkZzvu7L9Bunq9zvD-W02RFUXoW-Pd) | 25.10.2023 |
    • Philipp Moritz - wang.github.io/)</li> <li>[Alexey Tumanov](https://faculty.cc.gatech.edu/~atumanov/)</li><details><summary>others</summary><li>[Richard Liaw](https://github.com/richardliaw)</li> <li>[Eric Liang](https://github.com/ericl)</li> <li>[Melih Elibol](https://research.nvidia.com/person/melih-elibol)</li> <li>[Zongheng Yang](https://zongheng.me/)</li> <li>[William Paul](https://github.com/Wapaul1)</li> <li>[Michael Jordan](https://people.eecs.berkeley.edu/~jordan/)</li> <li>[Ion Stoica](https://people.eecs.berkeley.edu/~istoica/)</li></ul></details> | [![](https://img.shields.io/github/stars/ray-project/ray?style=social)](https://github.com/ray-project/ray) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1712.05889), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.05072), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1712.09381), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.05118), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.03924)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.ray.io/en/latest/index.html)</li><li>[website](https://www.ray.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LmROEotKhJA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uzt-CwohQC8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/XME90SGL6Vs)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ray-project/ray/blob/master/doc/source/tune/examples/optuna_example.ipynb) | 06.09.2023 |
    • James Betker - tts?style=social)](https://github.com/neonbjb/tortoise-tts) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.12092), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.09672), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.07889)</li><li>[examples](https://nonint.com/static/tortoise_v2_examples.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/neonbjb/DL-Art-School)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/patrickvonplaten), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/osanseviero/tortoisse-tts)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J3-jfS29RF4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/neonbjb/tortoise-tts/blob/main/tortoise_tts.ipynb) | 15.07.2023 |
    • BigScience - workshop/petals?style=social)](https://github.com/bigscience-workshop/petals) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2209.01188), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.07258)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/borzunov/chat.petals.ml), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/timDettmers/bitsandbytes)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/bigscience/bloom)</li><li>[project](https://petals.ml/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/BitTorrent)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Ervk6HPNS6AYVr3xVdQnY5a-TjjmLCdQ) | 05.07.2023 |
    • Stability AI - AI/StableLM?style=social)](https://github.com/Stability-AI/StableLM) <ul><li>[blog post](https://stability.ai/blog/stability-ai-launches-the-first-of-its-stablelm-suite-of-language-models)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/llama), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/tatsu-lab/stanford_alpaca), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/nomic-ai/gpt4all), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/databrickslabs/dolly), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/anthropics/hh-rlhf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/llama.cpp)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/lmsys/vicuna-13b-delta-v0), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/RyokoAI/ShareGPT52K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/stabilityai)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dypPSs4t77g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nWf1StvtoRw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Hg-s2RTaTFE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/qXtJjoEfTnA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Stability-AI/StableLM/blob/main/notebooks/stablelm-alpha.ipynb) | 27.04.2023 |
    • Ross Wightman - 5b/), [data](https://laion.ai/blog/laion-400-open-dataset/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mlfoundations/wise-ft), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/webdataset/webdataset), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/webdataset/tarp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research-datasets/conceptual-12m)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/laion/laion2B-en), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-B-32-laion2B-s34B-b79K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-L-14-laion2B-s32B-b82K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-H-14-laion2B-s32B-b79K), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/laion/CLIP-ViT-g-14-laion2B-s12B-b42K)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/mlfoundations/open_clip/blob/master/docs/Interacting_with_open_clip.ipynb) | 16.04.2023 |
    • Taku Kudo - smt/mosesdecoder/blob/master/scripts/tokenizer/tokenizer.perl), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rsennrich/subword-nmt), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/gperftools/gperftools), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Microsoft/vcpkg)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://jacky2wong.medium.com/understanding-sentencepiece-under-standing-sentence-piece-ac8da59f6b08)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U51ranzJBpY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/sentencepiece/blob/master/python/sentencepiece_python_module_example.ipynb) | 21.05.2024 |
    • Damian Stewart - ai/InvokeAI/issues/2832)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/cactusfriend/nightmare-invokeai-prompts)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/damian0815/compel/blob/main/compel-demo.ipynb) | 26.01.2023 |
    • Aleksei Petrenko - huang.github.io/)</li> <li>[Tushar Kumar](https://github.com/tushartk)</li> <li>[Gaurav Sukhatme](http://robotics.usc.edu/~gaurav/)</li> <li>[Vladlen Koltun](http://vladlen.info/)</li></ul> | [![](https://img.shields.io/github/stars/alex-petrenko/sample-factory?style=social)](https://github.com/alex-petrenko/sample-factory) <ul><li>[ICML](http://proceedings.mlr.press/v119/petrenko20a.html)</li><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.11751)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://www.samplefactory.dev/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/alex-petrenko/faster-fifo)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/lLG17LKKSZc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/alex-petrenko/sample-factory/blob/master/sf_examples/notebooks/samplefactory_hub_example.ipynb) | 17.01.2023 |
    • Matthias Fey - team/pytorch_geometric?style=social)](https://github.com/pyg-team/pytorch_geometric) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1903.02428), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.07829), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1609.02907), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.03123), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1905.05178), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.08566), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.10903), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1905.07953)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pytorch-geometric.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/snap-stanford/ogb/tree/master/examples), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/pyg-team/pyg-lib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rusty1s/pytorch_scatter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rusty1s/pytorch_sparse), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rusty1s/pytorch_cluster), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AntonioLonga/PytorchGeometricTutorial)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2018/hash/e77dbaf6759253c7c6d0efc5690369c7-Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2017/hash/5dd9db5e033da9c6fb5ba83c7a7ebea9-Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://nips.cc/virtual/2020/public/poster_3fe230348e9a12c13120749e3f9fa4cd.html)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/tutorials/beginner/basics/optimization_tutorial.html#full-implementation)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLGMXrbDNfqTzqxB1IGgimuhtfAhGd8lHF), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLGMXrbDNfqTwPxitLVHEbT9Pd6-oR_cud), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-UjytpbqX4A)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1h3-vJGRVloF5zStxL5I0rSy4ZUPNsjy8) | 08.12.2022 |
    • Ilya Belikov - Text-to-Music?style=social)](https://github.com/MubertAI/Mubert-Text-to-Music) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mubert2.docs.apiary.io/)</li><li>[project](https://mubert.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YJu0iXn-T_U), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5UsaxJsFvAI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/B0kkIpWifG4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ferluht/Mubert-Text-to-Music/blob/main/Mubert_Text_to_Music.ipynb) | 18.10.2022 |
    • Eugene Kharitonov - lee)</li> <li>[Ali Elkahky](https://scholar.google.com/citations?user=KB3S8RoAAAAJ)</li> <li>[Wei-Ning Hsu](https://wnhsu.github.io/)</li> <li>[Abdelrahman Mohamed](https://ai.facebook.com/people/abdelrahman-mohamed/)</li> <li>[Emmanuel Dupoux](http://www.lscp.net/persons/dupoux/)</li> <li>[Yossi Adi](https://www.cs.huji.ac.il/~adiyoss/)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/textlesslib?style=social)](https://github.com/facebookresearch/textlesslib) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.07359)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/waveglow), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/keithito/tacotron), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVIDIA/tacotron2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/pseeth/torch-stft)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/librispeech)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/textlesslib/blob/main/examples/resynthesis_and_continuation.ipynb) | 15.02.2022 |
    • nvidia - up-deep-learning-inference-using-tensorrt-updated/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.nvidia.com/deeplearning/tensorrt/)</li><li>[forum](https://forums.developer.nvidia.com/c/ai-data-science/deep-learning/tensorrt)</li><li>[website](https://developer.nvidia.com/tensorrt)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/TU5BMU6iYZ0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/6rZNLaS775w), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/G_KhUFCUSsY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7kJ-jph9gCw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/NVIDIA/TensorRT/blob/main/quickstart/IntroNotebooks/0.%20Running%20This%20Guide.ipynb) | 10.06.2021 |
    • James Bergstra - WgLkAAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/hyperopt/hyperopt?style=social)](https://github.com/hyperopt/hyperopt) <ul><li>[ICML](https://proceedings.mlr.press/v28/bergstra13.html)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](http://hyperopt.github.io/hyperopt/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-sklearn), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-nnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-convnet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hyperopt/hyperopt-gpsmbo)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2011/hash/86e8f7ab32cfd12577bc2619bc635690-Abstract.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Mp1xnPfE4PY), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tdwgR1AqQ8Y), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tteE_Vtmrv4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/hyperopt/hyperopt/blob/master/tutorial/01.BasicTutorial.ipynb) | 01.06.2021 |
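    The hyperopt entry above links a basic tutorial notebook; a minimal sketch of the library's core `fmin` loop, assuming `hyperopt` is installed and using a toy quadratic objective (not from the linked tutorial), looks like this:

    ```python
    from hyperopt import fmin, tpe, hp

    # Toy objective: minimize (x - 3)^2 over a continuous search space.
    # TPE (Tree-structured Parzen Estimator) proposes trial points; fmin
    # returns the best parameter assignment found as a dict.
    best = fmin(
        fn=lambda x: (x - 3) ** 2,
        space=hp.uniform("x", -10, 10),
        algo=tpe.suggest,
        max_evals=100,
    )
    print(best)  # e.g. {'x': <value near 3>}
    ```

    Real searches swap in a model-training objective and a richer `space` dict; the tutorial notebook linked in the row covers those cases.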
    • Zeyu Chen - zxf)</li><details><summary>others</summary><li>[Jinxuan Qiu](https://github.com/kinghuin)</li> <li>[Yuhan Shen](https://github.com/ShenYuhan)</li> <li>[Yuying Hao](https://github.com/haoyuying)</li> <li>[Xiaojie Chen](https://github.com/KPatr1ck)</li></ul></details> | [![](https://img.shields.io/github/stars/PaddlePaddle/PaddleHub?style=social)](https://github.com/PaddlePaddle/PaddleHub) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://paddlehub.readthedocs.io/en)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleOCR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleDetection), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/CMU-Perceptual-Computing-Lab/openpose), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleSeg), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleClas), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/ERNIE), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/baidu/LAC), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/baidu/DDParser), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/PaddlePaddle/PaddleSpeech)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/PaddlePaddle)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/analytics-vidhya/paddlehub-fdd1ec75a07b)</li><li>[website](https://www.paddlepaddle.org.cn/en)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9adXuF_lTSg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/PaddlePaddle/PaddleHub/blob/develop/demo/serving/bentoml/cloud-native-model-serving-with-bentoml.ipynb) | 20.04.2021 |
    • Silvia Terragni - fersini)</li> <li>[Antonio Candelieri](https://www.unimib.it/antonio-candelieri)</li> <li>[Pietro Tropeano](https://github.com/pietrotrope)</li><details><summary>others</summary><li>[Bruno Galuzzi](https://github.com/brunoG89)</li> <li>[Lorenzo Famiglini](https://github.com/lorenzofamiglini)</li> <li>[Davide Pietrasanta](https://github.com/davidepietrasanta)</li></ul></details> | [![](https://img.shields.io/github/stars/mind-Lab/octis?style=social)](https://github.com/mind-Lab/octis) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.01488)</li><li>[data](https://www.dbpedia.org/resources/ontology/), [data](https://www.statmt.org/europarl/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/estebandito22/PyTorchAVITM)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/a-beginners-guide-to-octis-optimizing-and-comparing-topic-models-is-simple-590554ec9ba6), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/a-beginners-guide-to-octis-vol-2-optimizing-topic-models-1214e58be1e5)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2000/hash/f9d1152547c0bde01830b7e8bd60024c-Abstract.html)</li><li>[paper](https://aclanthology.org/2021.eacl-demos.31/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/dataset/20-newsgroups)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nPmiWBFFJ8E)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/MIND-Lab/OCTIS/blob/master/examples/OCTIS_Optimizing_CTM.ipynb) | 19.04.2021 |
    • Haoqi Fan - wA73gAAAAJ)</li> <li>[Aaron Adcock](https://scholar.google.com/citations?&user=oa78zHUAAAAJ)</li> <li>[Wan-Yen Lo](https://github.com/wanyenlo)</li> <li>[Christoph Feichtenhofer](http://feichtenhofer.github.io/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3474085.3478329)](https://doi.org/10.1145/3474085.3478329) [![](https://img.shields.io/github/stars/facebookresearch/pytorchvideo?style=social)](https://github.com/facebookresearch/pytorchvideo) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.09887), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2104.11227)</li><li>[blog post](https://ai.facebook.com/blog/pytorchvideo-a-deep-learning-library-for-video-understanding/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://pytorchvideo.readthedocs.io/en/latest/index.html)</li><li>[website](https://github.com/facebookresearch/pytorchvideo)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/b7-gnpqz9Qg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/pytorchvideo/blob/main/tutorials/accelerator/Build_your_model_with_PytorchVideo_Accelerator.ipynb) | 13.04.2021 |
    • Weikang Song - learning)</li><li>[tensor encoding](http://jakubkonecny.com/files/tensor_encoding.pdf)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/api_docs/python/tff/simulation/datasets/emnist), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/federated/api_docs/python/tff/learning/build_federated_averaging_process)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/tff_for_federated_learning_research_compression.ipynb) | 28.06.2024 |
    • autodistill - grounded-sam), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-yolov8), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-yolonas), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-yolov5), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-detr), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-detic), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-grounding-dino), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-owl-vit), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-sam-clip), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-llava), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-kosmos-2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-owlv2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-roboflow-universe), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-azure-vision), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-rekognition), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/autodistill/autodistill-gcp-vision), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/roboflow/inference)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gKTYMfwPo4M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/M_QZ_Q0zT0k), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtube.com/roboflow)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/how-to-auto-train-yolov8-model-with-autodistill.ipynb) | 31.01.2024 |
    • cleanlab - examples)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://cleanlab.ai/slack)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/CleanlabAI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/cleanlab/cleanvision/blob/main/docs/source/tutorials/tutorial.ipynb) | 13.02.2024 |
    • Killian Lucas - interpreter?style=social)](https://github.com/KillianLucas/open-interpreter) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/6p3fD6rBVm)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.openinterpreter.com/)</li><li>[website](https://openinterpreter.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SqnXUHwIa3c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/s-f4lCETxu0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J-H2un1Adr0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/jaijpff58vw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7KFbG_3dKKs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4OhuFjPyZNQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/01tQLn_RRcE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/uyfoHQVgeY0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1WKmRXZgsErej2xUriKzxrEAXdxMSgWbb) | 03.01.2024 |
    • Leandro von Werra - human-preferences)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/xQ5nc1CF7iQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/67SO20dszNA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/trl/blob/master/examples/notebooks/best_of_n.ipynb) | 14.07.2023 |
    • Google - 9MYvPwI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/MXxN4fv01c8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FsxthdQ_sL4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/zEOtG-ChmZE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kBjYK3K3P6M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/8j1MWZGNoXM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/hszd5UqnfLk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/tpu/blob/master/tools/colab/keras_mnist_tpu.ipynb) | 20.12.2022 |
    • John Lalor - Graber](https://github.com/ezubaric)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2021.acl-long.346)](https://doi.org/10.18653/v1/2021.acl-long.346) [![](https://img.shields.io/github/stars/nd-ball/py-irt?style=social)](https://github.com/nd-ball/py-irt) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1908.11421)</li><li>[paper](https://www.frontiersin.org/articles/10.3389/fpsyg.2016.01422/full)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/akUxtt21Mlc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/nd-ball/py-irt/blob/master/examples/py-irt_example.ipynb) | 30.06.2022 |
    • Junnan Li - li/home)</li> <li>[Caiming Xiong](http://cmxiong.com/)</li> <li>[Steven Hoi](https://sites.google.com/view/stevenhoi)</li></ul> | [![](https://img.shields.io/github/stars/salesforce/BLIP?style=social)](https://github.com/salesforce/BLIP) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2201.12086)</li><li>[blog post](https://blog.salesforceairesearch.com/blip-bootstrapping-language-image-pretraining/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairscale), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/salesforce/ALPRO), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/dmlc/decord), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/salesforce/ALBEF), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rwightman/pytorch-image-models/tree/main/timm)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/X2k7n4FuI7c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/salesforce/BLIP/blob/main/demo.ipynb) | 02.03.2022 |
    • Katherine Crowson - bomze)</li></ul> | [![](https://img.shields.io/github/stars/chigozienri/VQGAN-CLIP-animations?style=social)](https://github.com/chigozienri/VQGAN-CLIP-animations) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCToztRy9FSTIhEen_1x4FAw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tg-bomze/collection-of-notebooks/blob/master/Text2Animation.ipynb) | 29.09.2021 |
    • Mikael Alafriz - sonic-dreams?style=social)](https://github.com/mikaelalafriz/lucid-sonic-dreams) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NVlabs/stylegan2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/justinpinkney/awesome-pretrained-stylegan2)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/introducing-lucid-sonic-dreams-sync-gan-art-to-music-with-a-few-lines-of-python-code-b04f88722de1)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/l-nGC-ve7sI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1Y5i50xSFIuN3V4Md8TB30_GOAtts7RQD) | 24.08.2021 |
    • Arseniy Shakhmatov - forever/Kandinsky-2?style=social)](https://github.com/ai-forever/Kandinsky-2) <ul><li>[blog post](https://habr.com/ru/companies/sberbank/articles/725282/)</li><li>[demo](https://editor.fusionbrain.ai/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/sberbank-ai/Kandinsky_2.1)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/LZvp4SWcCao), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/IoPhRE37XSU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/dYt9xJ7dnpU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rN2J5TL2RZ0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1xSbu-b-EwYd6GdaFPRVgvXBX_mciZ41e) | 07.08.2023 |
    • Willem Pienaar - dev/feast?style=social)](https://github.com/feast-dev/feast) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.feast.dev/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/baineng/feast-hive), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Shopify/feast-trino), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Azure/feast-azure), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/amundsen-io/amundsen/blob/main/databuilder/databuilder/extractor/feast_extractor.py)</li><li>[website](https://feast.dev/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/DaNv-Wf1MBA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/p2cuq4eJ2BY)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/feast-dev/feast/blob/master/examples/quickstart/quickstart.ipynb) | 28.02.2024 |
    • OpenMMLab - mmlab/mmagic?style=social)](https://github.com/open-mmlab/mmagic) <ul><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/raweFPmdzG)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://mmagic.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmgeneration), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmengine/blob/main/mmengine/model/wrappers/seperate_distributed.py), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mmcv), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/open-mmlab/mim)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://openmmlab.medium.com/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/OpenMMLab)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/openmmlab)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/open-mmlab/mmagic/blob/main/demo/mmagic_inference_tutorial.ipynb) | 11.09.2023 |
    • Benjamin Lefaudeux - caggiano.github.io/)</li> <li>[Sean Naren](https://github.com/SeanNaren)</li> <li>[Min Xu](https://github.com/min-xu-ai)</li> <li>[Jieru Hu](https://github.com/jieru-hu)</li> <li>[Marta Tintore](https://github.com/MartaTintore)</li> <li>[Susan Zhang](https://suchenzang.github.io/)</li> <li>[Patrick Labatut](https://github.com/patricklabatut)</li> <li>[Daniel Haziza](https://scholar.google.com/citations?user=2eSKdFMAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/xformers?style=social)](https://github.com/facebookresearch/xformers) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://facebookresearch.github.io/xformers/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/sputnik), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hgyhungry/ge-spmm), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/triton), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RobinBruegger/RevTorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/mlpen/Nystromformer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairscale), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/huggingface/pytorch-image-models), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dao-AILab/flash-attention)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/NJyZCdxnGe4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/xformers/blob/main/docs/source/xformers_mingpt.ipynb) | 11.08.2023 |
    • svc develop team - develop-team/so-vits-svc?style=social)](https://github.com/svc-develop-team/so-vits-svc) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/NaruseMioShirakana/MoeVoiceStudio), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvpi/DiffSinger/tree/refactor/modules/nsf_hifigan), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/auspicious3000/contentvec), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/yxlllc/DDSP-SVC), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/flutydeer/audio-slicer), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openvpi/audio-slicer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/NaruseMioShirakana/MoeSS-SUBModel/tree/main)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/svc-develop-team/so-vits-svc/blob/4.1-Stable/sovits4_for_colab.ipynb) | 31.07.2023 |
    • Yuan-Chen Guo - Tian Liu](https://github.com/thuliu-yt16)</li> <li>[Ruizhi Shao](https://github.com/DSaurus)</li> <li>[Christian Laforte](https://github.com/claforte)</li><details><summary>others</summary><li>[Vikram Voleti](https://github.com/voletiv)</li> <li>[Guan Luo](https://github.com/logan0601)</li> <li>[Chia-Hao Chen](https://scholar.google.com/citations?user=X0zirvMAAAAJ)</li> <li>[Zi-Xin Zou](https://github.com/zouzx)</li> <li>[Chen Wang](https://cwchenwang.github.io/)</li> <li>[Yanpei Cao](https://yanpei.me/)</li> <li>[Song-Hai Zhang](https://scholar.google.com/citations?user=AWtV-EQAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/threestudio-project/threestudio?style=social)](https://github.com/threestudio-project/threestudio) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.15413), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.16213), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.10440)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/ejer2MAB8N)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/DSaurus/Tensor4D), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/eladrich/latent-nerf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Gorilla-Lab-SCUT/Fantasia3D), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/cvlab-columbia/zero123), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/guochengqian/Magic123), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ayaanzhaque/instruct-nerf2nerf), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/KAIR-BAIR/nerfacc), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Lightning-AI/lightning), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ashawkey/fantasia3d.unofficial)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/DeepFloyd/IF-I-XL-v1.0), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/huggingface_hub/v0.14.1/guides/download#download-an-entire-repository)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1635cb0/threestudio_a_unified_framework_for_3d_content/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gT8Xvx5b6IE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/threestudio-project/threestudio/blob/main/threestudio.ipynb) | 28.07.2023 |
    • Alex Shonenkov - ai)</li> <li>[Daria Bakshandaeva](https://github.com/Gugutse)</li> <li>[Christoph Schuhmann](http://christoph-schuhmann.de/)</li><details><summary>others</summary><li>[Ksenia Ivanova](https://github.com/ivksu)</li> <li>[Nadiia Klokova](https://github.com/vauimpuls)</li></ul></details> | [![](https://img.shields.io/github/stars/deep-floyd/IF?style=social)](https://github.com/deep-floyd/IF) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.11487)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/umz62Mgr)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/DeepFloyd), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/optimization/fp16#model-offloading-for-fast-inference-and-memory-savings), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-speed), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/api/pipelines/if#optimizing-for-memory), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/blog/if), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/docs/diffusers/main/en/api/pipelines/if)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/code/shonenkov/deepfloyd-if-4-3b-generator-of-pictures)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/deepfloydai)</li><li>[website](https://deepfloyd.ai/deepfloyd-if)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/4Zkipll5Rjc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tq5ZXZWwTPA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rLtfd1TvYJk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/deepfloyd_if_free_tier_google_colab.ipynb) | 26.06.2023 |
    • IDEA-Research - Research/Grounded-Segment-Anything?style=social)](https://github.com/IDEA-Research/Grounded-Segment-Anything) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2304.02643), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2303.05499)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/MasterBin-IIAU/UNINEXT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/OSX), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/dvlab-research/VoxelNeXt), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Semantic-SAM), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/UX-Decoder/Segment-Everything-Everywhere-All-At-Once), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/IDEA-Research/OpenSeeD), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Computer-Vision-in-the-Wild/CVinW_Readings), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/sail-sg/EditAnything), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/feizc/IEA), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Li-Qingyun/sam-mmrotate), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/VainF/Awesome-Anything), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/RockeyCoss/Prompt-Segment-Anything)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oEQYStnF2l8), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gKTYMfwPo4M), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0Fpb8TBH0nM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/GuEDDBWrN24)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/betogaona7/Grounded-Segment-Anything/blob/main/grounded_sam_colab_demo.ipynb) | 12.04.2023 |
    • Fatih Cagatay Akyon - badge.php?doi=10.1109/ICIP46576.2022.9897990)](https://doi.org/10.1109/ICIP46576.2022.9897990) [![](https://img.shields.io/github/stars/obss/sahi?style=social)](https://github.com/obss/sahi) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.06934)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/fcakyon/small-object-detection-benchmark)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/models?pipeline_tag=object-detection&sort=downloads)</li><li>[<img src="images/kaggle.svg" alt="kaggle" height=20/>](https://www.kaggle.com/remekkinas/sahi-slicing-aided-hyper-inference-yv5-and-yx)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/codable/sahi-a-vision-library-for-performing-sliced-inference-on-large-images-small-objects-c8b086af3b80), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/codable/convert-any-dataset-to-coco-object-detection-format-with-sahi-95349e1fe2b7)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/obss/sahi/blob/main/demo/inference_for_yolov5.ipynb) | 23.02.2023 |
    • Luca Costabello - Abstract.html), [<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2013/hash/b337e84de8752b27eda3a12363109e80-Abstract.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/gX_KHaU8ChI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Accenture/AmpliGraph/blob/main/docs/tutorials/AmpliGraphBasicsTutorial.ipynb) | 23.02.2023 |
    • Gengshan Yang - y/rigidmask), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ShichenLiu/SoftRas), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ThibaultGROUEIX/ChamferDistancePytorch)</li><li>[project](https://banmo-www.github.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1NUa-yvFGA0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/jDTy-liFoCQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1dQJn1vsuz0DkyRZbOA1SulkVQ0V1kMUP) | 30.12.2022 |
    • Anton Emelyanov - forever/ru-gpts?style=social)](https://github.com/ai-forever/ru-gpts) <ul><li>[Christofari](https://sbercloud.ru/ru/christofari)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeedExamples/tree/master/Megatron-LM)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/transformers/main_classes/model.html#transformers.generation_utils.GenerationMixin.generate)</li><li>[sparse attention](https://www.deepspeed.ai/tutorials/sparse-attention/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/ai-forever/ru-gpts/blob/master/examples/ruGPT3XL_generation.ipynb) | 07.12.2022 |
    • Caglar Gulcehre - Arnold](http://gabe.squirrelsoup.net/)</li> <li>[Jerry Li](https://github.com/jerryli27)</li> <li>[Mohammad Norouzi](https://norouzi.github.io/)</li> <li>[Matt Hoffman](https://www.mwhoffman.com/)</li> <li>[Ofir Nachum](https://scholar.google.com/citations?user=C-ZlBWMAAAAJ)</li> <li>[George Tucker](https://sites.google.com/view/gjt)</li> <li>[Nicolas Heess](https://scholar.google.com/citations?user=79k7bGEAAAAJ)</li> <li>[Nando de Freitas](https://github.com/nandodf)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/deepmind-research?style=social)](https://github.com/deepmind/deepmind-research/tree/master/rl_unplugged) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.13888), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.04543), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1709.06009), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.09656), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.11711), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1909.12238), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.09451), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.00690), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2003.11881), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.09575)</li><li>[data](https://console.cloud.google.com/storage/browser/rl_unplugged)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/lab), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/realworldrl_suite#installation)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/n8yNYzbUMJ0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/deepmind-research/blob/master/rl_unplugged/dmlab_r2d2.ipynb) | 26.05.2022 |
    • Erik Nijkamp - tu.github.io/)</li><details><summary>others</summary><li>[Huan Wang](https://huan-december.github.io/)</li> <li>[Yingbo Zhou](https://scholar.google.com/citations?user=H_6RQ7oAAAAJ)</li> <li>[Silvio Savarese](https://cvgl.stanford.edu/silvio/)</li> <li>[Caiming Xiong](http://cmxiong.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/salesforce/CodeGen?style=social)](https://github.com/salesforce/CodeGen) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.13474), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.02309)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/salesforce/jaxformer)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/models?search=salesforce+codegen)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1fQI8OgzMAR0bquCrvhlAtXSw6iMFbVgI) | 23.04.2022 |
    • Corentin Jemine - Ochir Tuguldur](https://github.com/tugstugi)</li></ul> | [![](https://img.shields.io/github/stars/CorentinJ/Real-Time-Voice-Cloning?style=social)](https://github.com/CorentinJ/Real-Time-Voice-Cloning) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1806.04558), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.08435), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1703.10135), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1710.10467)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/fatchord/WaveRNN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/coqui-ai/tts), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/resemble-ai/Resemblyzer)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-O_hYhToKoA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tugstugi/dl-colab-notebooks/blob/master/notebooks/RealTimeVoiceCloning.ipynb) | 07.03.2022 |
    • Chi Wang - wu.github.io/)</li></ul> | [![](https://img.shields.io/github/stars/microsoft/FLAML?style=social)](https://github.com/microsoft/FLAML) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2106.04815), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.01571)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://microsoft.github.io/FLAML/)</li><li>[paper](https://www.microsoft.com/en-us/research/publication/flaml-a-fast-and-lightweight-automl-library/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/channel/UCfU0zfFXHXdAd5x-WvFBk5A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/euXpDYGgkGM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/microsoft/FLAML/blob/master/notebook/flaml_automl.ipynb) | 17.12.2021 |
    • Pablo Castro - 2.0.html)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://google.github.io/dopamine/docker/)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://google.github.io/dopamine/docs/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/atari-py#roms), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/mujoco-py#install-mujoco)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/the-21st-century/google-dopamine-new-rl-framework-f84a35b7fb3f)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/live/FWFoyFjeAaM?feature=share), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/bd4CsDp00RA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/dopamine/blob/master/dopamine/colab/jax_agent_visualizer.ipynb) | 03.08.2020 |
    • Dale Markowitz - learning-for-sports)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://manivannan-ai.medium.com/find-the-angle-between-three-points-from-2d-using-python-348c513e2cd)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=yLrOy2Xedgk)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/making_with_ml/blob/master/sports_ai/Sports_AI_Analysis.ipynb) | 14.07.2020 |
    • Takuya Akiba - votte)</li> <li>[Toshihiko Yanase](https://github.com/toshihikoyanase)</li> <li>[Takeru Ohta](https://github.com/sile)</li> <li>[Masanori Koyama](https://scholar.google.com/citations?user=oY1gA10AAAAJ)</li></ul> | [![](https://img.shields.io/github/stars/optuna/optuna?style=social)](https://github.com/optuna/optuna) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1907.10902)</li><li>[<img src="images/docker.svg" alt="docker" height=20/>](https://hub.docker.com/r/optuna/optuna)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://optuna.readthedocs.io/en/stable/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/optuna/optuna-dashboard)</li><li>[website](https://optuna.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J_aymk4YXhg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tcrcLRopTX0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-UeC4MR3PHM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/oC8zFYcfYXU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/optuna/optuna-examples/blob/main/quickstart.ipynb) | 15.02.2024 |
    • Alexey Bochkovskiy - the-most-accurate-real-time-neural-network-on-ms-coco-dataset-73adfd3602fe), [<img src="images/medium.svg" alt="medium" height=20/>](https://alexeyab84.medium.com/scaled-yolo-v4-is-the-best-neural-network-for-object-detection-on-ms-coco-dataset-39dfa22fa982)</li><li>[project](https://pjreddie.com/darknet/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/gydxzd/p_yolov4_the_most_accurate_realtime_neural/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/1_SiUOYUoOI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YDFf-TqJOFE)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1_GdoqCJWXsChrOiY8sZMr_zbr_fH-0Fg) | 25.06.2020 |
    • Julien Valentin - keskin)</li> <li>[Pavel Pidlypenskyi](https://github.com/podlipensky)</li> <li>[Ameesh Makadia](https://github.com/amakadia)</li><details><summary>others</summary><li>[Avneesh Sud](https://github.com/avneesh-g)</li> <li>[Sofien Bouaziz](http://sofienbouaziz.com/)</li></ul></details> | [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450508.3464595)](https://doi.org/10.1145/3450508.3464595) [![](https://img.shields.io/github/stars/tensorflow/graphics?style=social)](https://github.com/tensorflow/graphics) <ul><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/computer-graphics-computer-vision-tensorflow-graphics-110e955e26bb)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/graphics)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/_TFGraphics_)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Un0JDL3i5Hg)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/graphics/blob/master/tensorflow_graphics/notebooks/6dof_alignment.ipynb) | 20.05.2020 |
    • Andrey Nikishaev - learning-world/tutorial-making-road-traffic-counting-app-based-on-computer-vision-and-opencv-166937911660)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=_o5iLbRHKao)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/12N4m_RYKqrpozRzh9qe7nQE_sIqQH9U8) | 10.01.2020 |
    • François Chollet - learning)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Transfer_learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/images/transfer_learning.ipynb) | 26.06.2024 |
    • Loïc Barrault - An Chung](https://iamyuanchung.github.io/)</li> <li>[Mariano Coria](https://www.linkedin.com/in/marianocoria)</li> <li>[David Dale](https://daviddale.ru/)</li><details><summary>others</summary><li>[Ning Dong](https://scholar.google.com/citations?user=gg1hvjoAAAAJ)</li> <li>[Mark Duppenthaler](https://github.com/mduppes)</li> <li>[Paul-Ambroise Duquenne](https://scholar.google.com/citations?user=Uah8IcAAAAAJ)</li> <li>[Hady Elsahar](https://www.hadyelsahar.io/)</li> <li>[Min-Jae Hwang](https://mjhwang93.github.io/)</li> <li>[Hirofumi Inaguma](https://hirofumi0810.github.io/)</li> <li>[Ilia Kulikov](https://github.com/uralik)</li> <li>[Pengwei Li](https://scholar.google.com/citations?user=hQB3YsYAAAAJ)</li> <li>[Daniel Licht](https://github.com/Lichtphyz)</li> <li>[Jean Maillard](https://scholar.google.com/citations?user=_ewOoK0AAAAJ)</li> <li>[Ruslan Mavlyutov](https://github.com/mavlyutovr)</li> <li>[Kaushik Ram Sadagopan](https://github.com/kauterry)</li> <li>[Abinesh Ramakrishnan](https://github.com/ibanesh)</li> <li>[Tuan Tran](https://antoine-tran.github.io/)</li> <li>[Guillaume Wenzek](https://github.com/gwenzek)</li> <li>[Yilin Yang](https://yilinyang7.github.io/)</li> <li>[Ethan Ye](https://github.com/yeyinthtoon)</li> <li>[Ivan Evtimov](https://ivanevtimov.eu/)</li> <li>[Pierre Fernandez](https://pierrefdz.github.io/)</li> <li>[Robin San Roman](https://scholar.google.com/citations?user=AJ3ir84AAAAJ)</li> <li>[Bokai Yu](https://scholar.google.com/citations?user=7jNmPwUAAAAJ)</li> <li>[Pierre Andrews](https://github.com/Mortimerp9)</li> <li>[Can Balioglu](http://canbalioglu.com/)</li> <li>[Peng-Jen Chen](https://scholar.google.com/citations?user=rOXs9VMAAAAJ)</li> <li>[Marta Costa-jussà](https://costa-jussa.com/)</li> <li>[Maha Elbayad](http://elbayadm.github.io/)</li> <li>[Hongyu Gong](https://github.com/hygong-fb)</li> <li>[Francisco Guzmán](https://guzmanhe.github.io/)</li> <li>[Kevin Heffernan](https://github.com/heffernankevin)</li> 
<li>[Somya Jain](https://scholar.google.com/citations?user=AmBxU3kAAAAJ)</li> <li>[Justine Kao](https://scholar.google.com/citations?user=Y9BLeTAAAAAJ)</li> <li>[Ann Lee](https://www.stat.cmu.edu/~annlee/)</li> <li>[Xutai Ma](https://github.com/xutaima)</li> <li>[Benjamin Peloquin](https://scholar.google.com/citations?user=5GNAjB8AAAAJ)</li> <li>[Juan Pino](https://scholar.google.com/citations?user=weU_-4IAAAAJ)</li> <li>[Sravya Popuri](https://scholar.google.com/citations?user=MtmqG3UAAAAJ)</li> <li>[Holger Schwenk](https://github.com/hoschwenk)</li> <li>[Anna Sun](https://github.com/annasun28)</li> <li>[Paden Tomasello](https://scholar.google.com/citations?user=sBtWMGYAAAAJ)</li> <li>[Changhan Wang](https://www.changhan.me/)</li> <li>[Skyler Wang](https://www.skylerwang.com/)</li> <li>[Mary Williamson](https://scholar.google.com/citations?user=Ys4xB-QAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/facebookresearch/seamless_communication?style=social)](https://github.com/facebookresearch/seamless_communication) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2312.05187)</li><li>[blog post](https://ai.meta.com/research/seamless-communication/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/libsndfile/libsndfile), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/fairseq2), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/SimulEval), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/stopes), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/facebookresearch/SONAR)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/seamless-m4t-v2-large), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/seamless-expressive), [<img src="images/hf.svg" alt="hf" 
height=20/>](https://huggingface.co/facebook/seamless-streaming)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://ngwaifoong92.medium.com/beginners-guide-to-seamlessm4t-81efad6e8ca6)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=0padjtkHXTE), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rNN7qsoCKBo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/RKEFZ44YOcc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/facebookresearch/seamless_communication/blob/main/Seamless_Tutorial.ipynb) | 14.12.2023 |
    • Albert Jiang - maria-lengyel)</li> <li>[Guillaume Lample](https://github.com/glample)</li> <li>[Lucile Saulnier](https://scholar.google.com/citations?user=Baj_9IsAAAAJ)</li> <li>[Lélio Renard Lavaud](https://github.com/lerela)</li> <li>[Marie-Anne Lachaux](https://scholar.google.com/citations?user=dSEMIJ8AAAAJ)</li> <li>[Pierre Stock](https://github.com/pierrestock)</li> <li>[Teven Scao](https://scholar.google.com/citations?user=ik0_vxsAAAAJ)</li> <li>[Thibaut Lavril](https://scholar.google.com/citations?user=9nPunCEAAAAJ)</li> <li>[Thomas Wang](https://github.com/thomasw21)</li> <li>[Timothée Lacroix](https://scholar.google.com/citations?&user=tZGS6dIAAAAJ)</li> <li>[William Sayed](https://www.linkedin.com/in/william-el-sayed-48672312a)</li></ul></details> | [![](https://img.shields.io/github/stars/mistralai/mistral-src?style=social)](https://github.com/mistralai/mistral-src) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.06825), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1904.10509), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2004.05150), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2306.05685)</li><li>[blog post](https://mistral.ai/news/announcing-mistral-7b/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.com/invite/mistralai)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.mistral.ai/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/vllm-project/vllm), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lm-sys/FastChat), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ggerganov/ggml), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Dao-AILab/flash-attention), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/skypilot-org/skypilot)</li><li>[<img 
src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/mistralai)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/mistral-7b-recipes-for-fine-tuning-and-quantization-on-your-computer-631401583f77)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g7kVVBlCGo0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ASpageg8nPw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/OMIuP6lQXe4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/jnPZApwtE4I), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/3SdopNwQJ-c)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/Mistral-colab/blob/main/Mistral_colab.ipynb) | 09.10.2023 |
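The Mistral 7B paper pairs grouped-query attention with sliding-window attention, where each token attends only to the previous `window` tokens (4096 in the paper). A minimal sketch of the sliding-window causal mask; this is illustrative only, not code from `mistral-src`:

```python
def sliding_window_mask(seq_len, window):
    """Causal attention mask in which position i may attend only to
    positions j with i - window < j <= i (Mistral-style sliding window).
    Returns a seq_len x seq_len list of 0/1 entries."""
    return [[1 if 0 <= i - j < window else 0 for j in range(seq_len)]
            for i in range(seq_len)]

mask = sliding_window_mask(5, 3)
```

With a window of 3, position 4 can see positions 2–4 but not 0–1, which is what bounds per-token attention cost regardless of sequence length.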
    • Boris Dayma - suraj)</li> <li>[Pedro Cuenca](https://github.com/pcuenca)</li> <li>[Khalid Saifullah](https://khalidsaifullaah.github.io/)</li><details><summary>others</summary><li>[Tanishq Abraham](https://github.com/tmabraham)</li> <li>[Phúc H. Lê Khắc](https://lkhphuc.com/)</li> <li>[Luke Melas](https://lukemelas.github.io/)</li> <li>[Ritobrata Ghosh](https://ghosh-r.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/borisdayma/dalle-mini?style=social)](https://github.com/borisdayma/dalle-mini) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2102.08981), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1910.13461), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.09841), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.04015)</li><li>[blog post](https://wandb.ai/dalle-mini/dalle-mini/reports/DALL-E-mini--Vmlldzo4NjIxODA)</li><li>[data](https://aclanthology.org/P18-1238/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/huggingface/transformers/tree/master/examples/research_projects/jax-projects), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/CLIP/blob/main/data/yfcc100m.md)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/spaces/flax-community/dalle-mini)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/borisdayma/dalle-mini/blob/main/tools/inference/inference_pipeline.ipynb) | 22.08.2023 |
    • Vincent Stimper - cr)</li> <li>[Vincent Berenz](http://vincentberenz.is.tuebingen.mpg.de/)</li><details><summary>others</summary><li>[Lukas Ryll](https://github.com/lukasryll)</li> <li>[Bernhard Schölkopf](https://scholar.google.com/citations?user=DZ-fHPgAAAAJ)</li> <li>[José Miguel Hernández-Lobato](https://jmhl.org/)</li></ul></details> | [![](https://img.shields.io/github/stars/VincentStimper/normalizing-flows?style=social)](https://github.com/VincentStimper/normalizing-flows) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2302.12014)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://vincentstimper.github.io/normalizing-flows/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/VincentStimper/resampled-base-flows), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/VincentStimper/hmc-hyperparameter-tuning)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Von_Mises_distribution)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/VincentStimper/normalizing-flows/blob/master/examples/paper_example_nsf_colab.ipynb) | 26.06.2023 |
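Normalizing flows compute exact densities via the change-of-variables formula, log p_x(x) = log p_z(f(x)) + log |det df/dx|. A one-dimensional affine-flow sketch in plain Python, where the base distribution is a standard normal (illustrative; the `normflows` package composes many such layers):

```python
import math

def affine_flow_logprob(x, mu, sigma):
    """Log-density of x under an affine flow z = (x - mu) / sigma pushed
    through a standard-normal base: log p(x) = log N(z; 0, 1) - log sigma.
    The -log(sigma) term is the log |det| of the (scalar) Jacobian."""
    z = (x - mu) / sigma
    log_base = -0.5 * (z * z + math.log(2 * math.pi))
    return log_base - math.log(sigma)
```

For this affine case the result coincides with the Gaussian N(mu, sigma²) log-density, which makes the Jacobian bookkeeping easy to check.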
    • Eren Gölge - AlJafari](https://github.com/Aya-AlJafari)</li> <li>[Edresson Casanova](https://github.com/Edresson)</li> <li>[Josh Meyer](http://jrmeyer.github.io/)</li><details><summary>others</summary><li>[Kelly Davis](https://github.com/kdavis-coqui)</li> <li>[Reuben Morais](https://github.com/reuben)</li></ul></details> | [![](https://img.shields.io/github/stars/coqui-ai/TTS?style=social)](https://github.com/coqui-ai/TTS) <ul><li>[blog post](https://coqui.ai/blog/tts/solving-attention-problems-of-tts-models-with-double-decoder-consistency)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://tts.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/coqui-ai/TTS-papers)</li><li>[samples](https://erogol.github.io/ddc-samples/)</li><li>[website](https://coqui.ai/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ADnBCz0Wd1U), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Yglxf2WbkLU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/alpI-DnVlO0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/coqui-ai/TTS/blob/dev/notebooks/Tutorial_2_train_your_first_TTS_model.ipynb) | 26.04.2023 |
    • Andreas Köpf - schuhmann.de/)</li><details><summary>others</summary><li>[Keith Stevens](https://fozziethebeat.github.io/)</li> <li>[Abdullah Barhoum](https://github.com/AbdBarho)</li> <li>[Nguyen Minh Duc](https://github.com/notmd)</li> <li>[Oliver Stanley](https://olliestanley.github.io/)</li> <li>[James Melvin Ebenezer](https://github.com/melvinebenezer)</li></ul></details> | [![](https://img.shields.io/github/stars/LAION-AI/Open-Assistant?style=social)](https://github.com/LAION-AI/Open-Assistant) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2203.02155)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://projects.laion.ai/Open-Assistant/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/OpenAssistant)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://generativeai.pub/open-assistant-a-free-and-open-source-alternative-to-chatgpt-67d15229813)</li><li>[website](https://open-assistant.io/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/64Izfm24FKA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ddG2fM9i4Kk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/FQIHLFLrTw0)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/LAION-AI/Open-Assistant/blob/main/notebooks/data-augmentation/stackexchange-builder/stackexchange-builder.ipynb) | 14.01.2023 |
    • Sergio Guadarrama - g.github.io/)</li><details><summary>others</summary><li>[Ethan Holly](https://github.com/eholly-g)</li> <li>[Sam Fishman](http://sam.fish/)</li> <li>[Ke Wang](https://scholar.google.com/citations?user=QRYX59sAAAAJ)</li> <li>[Ekaterina Gonina](https://github.com/egonina)</li> <li>[Neal Wu](https://twitter.com/WuNeal)</li> <li>[Efi Kokiopoulou](https://github.com/efiko)</li> <li>[Luciano Sbaiz](https://scholar.google.com/citations?user=fKBmhcUAAAAJ)</li> <li>[Jamie Smith](https://scholar.google.com/citations?user=jk17mo8AAAAJ)</li> <li>[Gábor Bartók](https://github.com/bartokg)</li> <li>[Jesse Berent](https://www.linkedin.com/in/jesse-berent-a1b6875)</li> <li>[Chris Harris](https://www.linkedin.com/in/charris)</li> <li>[Vincent Vanhoucke](https://vincent.vanhoucke.com/)</li> <li>[Eugene Brevdo](https://ebrevdo.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/tensorflow/agents?style=social)](https://github.com/tensorflow/agents) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://www.tensorflow.org/agents/api_docs/python/tf_agents)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/introduction-to-tf-agents-a-library-for-reinforcement-learning-in-tensorflow-68ab9add6ad6), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/analytics-vidhya/tf-agents-a-flexible-reinforcement-learning-library-for-tensorflow-5f125420f64b)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/agents)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/2nKD6zFQ8xI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-TTziY7EmUA), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/52DTXidSVWc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/U7g7-Jzj9qo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/tAOApRQAgpc), [<img src="images/yt.svg" 
alt="yt" height=20/>](https://youtu.be/X4eruXqNbDc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/g0yDlAbi6Pc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VmZI_YkfPBM), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/7QFSziiAnxI)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/agents/blob/master/docs/tutorials/0_intro_rl.ipynb) | 15.12.2022 |
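The linked intro-to-RL tutorial is built around the standard agent–environment loop. As a library-free sketch of that loop, here is tabular Q-learning on a toy chain environment (the environment and all names here are illustrative, not TF-Agents API):

```python
import random

def q_learning_chain(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a 1-D chain: start at state 0, actions
    0=left / 1=right, reward 1.0 only on reaching the rightmost state."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy action selection.
            a = rng.randrange(2) if rng.random() < eps else (0 if q[s][0] > q[s][1] else 1)
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # One-step temporal-difference update.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = q_learning_chain()
```

After training, the greedy policy moves right from every state, and Q-values decay geometrically (by gamma) with distance from the goal.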
    • Filippo Vicentini - i-szabo)</li> <li>[Dian Wu](https://github.com/wdphy16)</li><details><summary>others</summary><li>[Christopher Roth](https://github.com/chrisrothUT)</li> <li>[Clemens Giuliani](https://github.com/inailuig)</li> <li>[Gabriel Pescia](https://github.com/gpescia)</li> <li>[Jannes Nys](https://github.com/jwnys)</li> <li>[Vladimir Vargas-Calderón](https://github.com/VolodyaCO)</li> <li>[Nikita Astrakhantsev](https://github.com/nikita-astronaut)</li> <li>[Giuseppe Carleo](https://github.com/gcarleo)</li> <li>[Kenny Choo](https://github.com/kchoo1118)</li> <li>[James Smith](https://jamesetsmith.github.io/)</li> <li>[Tom Westerhout](https://github.com/twesterhout)</li> <li>[Fabien Alet](https://github.com/fabienalet)</li> <li>[Emily Davis](https://github.com/emilyjd)</li> <li>[Stavros Efthymiou](https://github.com/stavros11)</li> <li>[Ivan Glasser](https://www.researchgate.net/profile/Ivan-Glasser)</li> <li>[Sheng-Hsuan Lin](https://shhslin.github.io/)</li> <li>[Marta Mauri](https://github.com/martamau)</li> <li>[Mazzola Guglielmo](https://www.ics.uzh.ch/en/research/research-groups/Guglielmo-Mazzola0.html)</li> <li>[Christian Mendl](http://christian.mendl.net/)</li> <li>[Evert Nieuwenburg](https://evert.info/)</li> <li>[Ossian O'Reilly](https://github.com/ooreilly)</li> <li>[Hugo Théveniaut](https://github.com/theveniaut)</li> <li>[Giacomo Torlai](https://github.com/GTorlai)</li> <li>[Alexander Wietek](https://awietek.github.io/)</li></ul></details> | [![](https://img.shields.io/github/stars/netket/netket?style=social)](https://github.com/netket/netket) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10526)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://netket.readthedocs.io/en/latest/index.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/mpi4jax/mpi4jax), [<img src="images/git.svg" alt="git" 
height=20/>](https://github.com/cloudhan/jax-windows-builder)</li><li>[website](https://www.netket.org/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Ryz-o71tuy8)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/PhilipVinc/Lectures/blob/main/2202_NetKet/01_intro.ipynb) | 15.09.2022 |
    • Sabela Ramos - vincent-1958381)</li><details><summary>others</summary><li>[Hanna Yakubovich](https://github.com/yakubanna)</li> <li>[Daniel Toyama](https://github.com/kenjitoyama)</li> <li>[Anita Gergely](https://www.linkedin.com/in/anita-g-318064b2/)</li> <li>[Piotr Stanczyk](https://scholar.google.com/citations?user=fKVK0dYAAAAJ)</li> <li>[Raphaël Marinier](https://github.com/RaphaelMarinier)</li> <li>[Jeremiah Harmsen](https://github.com/jharmsen)</li> <li>[Olivier Pietquin](https://research.google/people/105812/)</li> <li>[Nikola Momchev](https://scholar.google.com/citations?user=PbWgaswAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/google-research/rlds?style=social)](https://github.com/google-research/rlds) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2111.02767)</li><li>[blog post](https://ai.googleblog.com/2021/12/rlds-ecosystem-to-generate-share-and.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/envlogger), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/google-research/rlds-creator), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Farama-Foundation/D4RL), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/deepmind/dm_env/blob/master/docs/index.md)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](http://www.tensorflow.org/datasets/catalog/overview), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/robosuite_panda_pick_place_can), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/locomotion), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/catalog/mt_opt), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/external_tfrecord?hl=en#load_dataset_with_tfds), [<img src="images/tf.svg" alt="tf" 
height=20/>](https://www.tensorflow.org/api_docs/python/tf/data), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/data_performance#optimize_performance), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#shuffle), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/splits), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/datasets/api_docs/python/tfds/load)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/rlds/blob/main/rlds/examples/rlds_tutorial.ipynb) | 16.03.2022 |
    • Тимчишин Віталій - 1l4XYhrIyS6A), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/-RdOwhmqP5s), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/R13BD8qKeTg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ZkjP5RJLQF4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/J4Wdy0Wc_xQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/mBcLRGuAFUk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YIGtalP1mv0), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Yz5pySyEtsU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/x5zLaWT5KPs), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/yBwpo-L80Mc), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL3FW7Lu3i5JvHM8ljYj-zLfQRF3EO8sYv)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/fbeilstein/machine_learning/blob/master/lecture_01_introduction.ipynb) | 02.09.2021 |
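Introductory machine-learning lectures typically start from simple geometric classifiers such as k-nearest neighbours, which fits in a few lines of dependency-free Python (this sketch is for intuition only and is not taken from the linked course materials):

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training
    points (squared Euclidean distance). `train` is a list of (point, label)."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))
    nearest = sorted(train, key=lambda pl: dist(pl[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Two toy clusters: label "a" near the origin, label "b" near (5, 5).
train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((6, 5), "b"), ((5, 6), "b")]
```

`knn_predict(train, (0.5, 0.5))` lands in the "a" cluster; points near (5, 5) vote "b".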
    • Malcolm Reynolds - jrae)</li> <li>[Andreas Fidjeland](https://github.com/akfidjeland)</li> <li>[Fabio Viola](https://github.com/fabioviola)</li><details><summary>others</summary><li>[Adrià Puigdomènech](https://github.com/adria-p)</li> <li>[Frederic Besse](https://github.com/fbesse)</li> <li>[Tim Green](http://tfgg.me/)</li> <li>[Sébastien Racanière](https://scholar.google.com/citations?user=o-h0vrQAAAAJ)</li> <li>[Gabriel Barth-Maron](https://github.com/fastturtle)</li> <li>[Diego Casas](https://github.com/diegolascasas)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/sonnet?style=social)](https://github.com/deepmind/sonnet) <ul><li>[blog post](https://www.deepmind.com/blog/open-sourcing-sonnet-a-new-library-for-constructing-neural-networks)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://sonnet.readthedocs.io/en/latest/index.html)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://papers.nips.cc/paper/2016/hash/fb87582825f9d28a8d42c5e5e5e8b23d-Abstract.html)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/checkpoint), [<img src="images/tf.svg" alt="tf" height=20/>](https://www.tensorflow.org/guide/saved_model)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/rlpQjnUvoKw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/deepmind/sonnet/blob/v2/examples/little_gan_on_mnist.ipynb) | 17.04.2020 |
    • Ashish Vaswani - MdPcAAAAJ)</li> <li>[Eugene Brevdo](https://ebrevdo.github.io/)</li> <li>[François Chollet](https://fchollet.com/)</li><details><summary>others</summary><li>[Aidan Gomez](https://gom.ai/)</li> <li>[Stephan Gouws](https://scholar.google.com/citations?user=lLTdYUYAAAAJ)</li> <li>[Llion Jones](https://www.linkedin.com/in/llion-jones-9ab3064b)</li> <li>[Łukasz Kaiser](https://scholar.google.com/citations?user=JWmiQR0AAAAJ)</li> <li>[Nal Kalchbrenner](https://www.nal.ai/)</li> <li>[Niki Parmar](https://github.com/nikiparmar)</li> <li>[Ryan Sepassi](https://ryansepassi.com/)</li> <li>[Noam Shazeer](https://github.com/nshazeer)</li> <li>[Jakob Uszkoreit](https://scholar.google.com/citations?user=mOG0bwsAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/tensorflow/tensor2tensor?style=social)](https://github.com/tensorflow/tensor2tensor) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1803.07416), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1812.02825), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.03762), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.03059), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.05137), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1801.09797)</li><li>[blog post](https://ai.googleblog.com/2017/06/accelerating-deep-learning-research.html)</li><li>[data](https://research.fb.com/downloads/babi/)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/tensor2tensor-and-one-model-to-learn-them-all-7ef3f9b61ba4)</li><li>[<img src="images/tf.svg" alt="tf" height=20/>](https://tensorflow.github.io/tensor2tensor/cloud_mlengine.html), [<img src="images/tf.svg" alt="tf" height=20/>](https://tensorflow.github.io/tensor2tensor/cloud_tpu.html)</li><li>[<img src="images/yt.svg" 
alt="yt" height=20/>](https://youtu.be/O2UvKxaOH7c), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/VYQ8n3Besrw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/cS2UZKHq4i4)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/tensor2tensor/blob/master/tensor2tensor/notebooks/Transformer_translate.ipynb) | 14.01.2020 |
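The Transformer's core operation is scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A plain-list sketch of that formula (illustrative only; tensor2tensor's real implementation is batched, multi-headed, and fused):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V,
    with Q, K, V as lists of row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

When a query aligns strongly with one key, the softmax concentrates nearly all weight on that key's value row.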
    • Curtis Northcutt - badge.php?doi=10.1613/jair.1.12125)](https://doi.org/10.1613/jair.1.12125) [![](https://img.shields.io/github/stars/cleanlab/cleanlab?style=social)](https://github.com/cleanlab/cleanlab) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1911.00068)</li><li>[blog post](https://l7.curtisnorthcutt.com/confident-learning)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.cleanlab.ai/)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/@sujathamudadla1213/cleanlab-python-library-34e0a37720ef)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://cleanlab.ai/slack)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/CleanlabAI)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/BnOTv0f9Msk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/nGye-lrsLRc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/QHaT_AiUljw)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/cleanlab/cleanlab/blob/master/docs/source/tutorials/image.ipynb) | 30.03.2024 |
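Confident learning flags likely label errors by comparing each example's predicted probability for its given label against a per-class confidence threshold (the class's average self-confidence). A simplified, dependency-free sketch of that thresholding step; this is not cleanlab's actual API:

```python
def likely_label_errors(pred_probs, labels):
    """Return indices of examples whose predicted probability for their
    given label falls below that class's average self-confidence --
    a simplified form of the thresholding used in confident learning.
    `pred_probs[i][c]` is the model's probability that example i is class c."""
    classes = sorted(set(labels))
    thresholds = {
        c: sum(p[c] for p, y in zip(pred_probs, labels) if y == c)
           / max(1, sum(1 for y in labels if y == c))
        for c in classes
    }
    return [i for i, (p, y) in enumerate(zip(pred_probs, labels))
            if p[y] < thresholds[y]]
```

An example labeled class 0 but given only probability 0.1 for class 0 by the model falls well under the class-0 threshold and gets flagged for review.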
    • Sourab Mangrulkar - tune-flan-t5-peft)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://huggingface.co/docs/peft)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/microsoft/DeepSpeed/issues/3002)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/ought/raft/viewer/twitter_complaints), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/bigscience/T0_3B), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/bigscience/mt0-xxl), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/facebook/opt-6.7b), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/roberta-large), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/glue/viewer/mrpc)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YVU5wAA6Txo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Us5ZFp16PaU), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/YKCtbIJC3kQ)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/huggingface/peft/blob/master/examples/int8_training/Finetune_flan_t5_large_bnb_peft.ipynb) | 21.03.2024 |
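PEFT methods such as LoRA freeze the pretrained weight matrix and train only a low-rank additive update, h = W·x + scale·B·(A·x). The arithmetic in miniature, with plain lists (illustrative; the `peft` library wires this into real transformer layers):

```python
def matvec(M, x):
    """Multiply matrix M (list of rows) by vector x."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def lora_forward(x, W, A, B, scale=1.0):
    """LoRA forward pass: h = W x + scale * B (A x).
    W (d_out x d_in) stays frozen; only the low-rank factors
    A (r x d_in) and B (d_out x r) are trained, so trainable
    parameters scale with r rather than d_out * d_in."""
    base = matvec(W, x)
    delta = matvec(B, matvec(A, x))
    return [b + scale * d for b, d in zip(base, delta)]
```

With rank r much smaller than the layer dimensions, the adapter adds only r·(d_in + d_out) trainable parameters per layer.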
    • Alex Wiltschko - 467105048) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1406.2572), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1706.04454), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1802.03451), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.07062)</li><li>[book](https://mitpress.mit.edu/sites/default/files/titles/content/sicm_edition_2/book.html), [book](https://mitpress.mit.edu/books/functional-differential-geometry)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/google/jax#auto-vectorization-with-vmap), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/hips/autograd)</li><li>[tutorial](http://videolectures.net/deeplearning2017_johnson_automatic_differentiation/)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Truncated_Newton_method), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Pullback_(differential_geometry)), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Holomorphic_function), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Cauchy%E2%80%93Riemann_equations)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/jax/blob/main/docs/notebooks/autodiff_cookbook.ipynb) | 21.06.2024 |
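The autodiff cookbook linked above covers forward- and reverse-mode differentiation; the forward-mode idea can be demonstrated with dual numbers in a few lines of plain Python (this illustrates the principle behind Jacobian-vector products such as `jax.jvp`, and is not JAX code):

```python
class Dual:
    """A dual number (value, derivative). Propagating the derivative
    through + and * is exactly forward-mode automatic differentiation."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at the dual number x + 1*eps and read off the derivative."""
    return f(Dual(x, 1.0)).der

# d/dx (x*x + 3x) at x = 2 is 2*2 + 3 = 7
```

Reverse mode (the basis of `jax.grad`) instead records the computation and propagates sensitivities backwards, which is cheaper when there are many inputs and few outputs.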
    • Brian Moore - examples)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/voxel51), [<img src="images/medium.svg" alt="medium" height=20/>](https://towardsdatascience.com/open-source-tools-for-fast-computer-vision-model-building-b39755aab490)</li><li>[<img src="images/slack.svg" alt="slack" height=20/>](https://slack.voxel51.com/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/voxel51)</li><li>[website](https://voxel51.com/fiftyone/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PLuREAXoPgT0SJLKsgFzKxffMApbXp90Gi)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/voxel51/fiftyone-examples/blob/master/examples/quickstart.ipynb) | 27.02.2024 |
    • Jonathan Heek - van-zee/)</li></ul></details> | [![](https://img.shields.io/github/stars/google/flax?style=social)](https://github.com/google/flax) <ul><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://flax.readthedocs.io/)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://github.com/huggingface/transformers/tree/main/examples/flax)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/google-introduces-flax-a-neural-network-library-for-jax-84bdc6f8f160)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/erpdf7/p_flax_a_neural_network_library_for_jax_designed/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/e8StU6WQCqw), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/HOlQzrn84A4), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5eUSmJvK8WA)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google/flax/blob/main/docs/quick_start.ipynb) | 10.01.2024 |
    • Mostafa Dehghani - badge.php?doi=10.1109/CVPR52688.2022.02070)](https://doi.org/10.1109/CVPR52688.2022.02070) [![](https://img.shields.io/github/stars/google-research/scenic?style=social)](https://github.com/google-research/scenic) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2110.11403)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/syncedreview/google-open-sources-scenic-a-jax-library-for-rapid-computer-vision-model-prototyping-and-894dbdeddbae)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/deeplearning/comments/qgyjck/r_google_opensources_scenic_a_jax_library_for/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/google-research/scenic/blob/main/scenic/common_lib/colabs/scenic_playground.ipynb) | 04.05.2022 |
    • Erwin Coumans - 7nkNCfoEKap4z3qadLVj8QB4a), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/9p0O941opGc), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/kZxPaGdoSJY), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL9LUFPiB6N3YrS0O7XM_1sBVWRnSRB643)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bulletphysics/bullet3/blob/master/examples/pybullet/notebooks/HelloPyBullet.ipynb) | 13.10.2020 |
    • awesome-colab-notebooks - colab-notebooks.svg)](https://starchart.cc/amrzv/awesome-colab-notebooks)
    • Chen Chen
    • Ian Osband - lattimore.com/)</li> <li>[Csaba Szepesvari](https://sites.ualberta.ca/~szepesva/)</li> <li>[Satinder Singh](http://web.eecs.umich.edu/~baveja/)</li> <li>[Benjamin Van Roy](https://web.stanford.edu/~bvr/)</li> <li>[Richard Sutton](http://www.incompleteideas.net/)</li> <li>[David Silver](https://www.davidsilver.uk/)</li> <li>[Hado Van Hasselt](https://hadovanhasselt.com/)</li></ul></details> | [![](https://img.shields.io/github/stars/deepmind/bsuite?style=social)](https://github.com/deepmind/bsuite) <ul><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/gym)</li><li>[paper](https://openreview.net/forum?id=rygf-kSYwH)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/Wcv4eU_qtZU)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1rU20zJ281sZuMD1DHbsODFr1DbASL0RH) | 13.02.2021 |
    • Max Woolf - rnn/master/data/tinyshakespeare/input.txt)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.aitextgen.io/)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/task/text-generation)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/15qBZx5y9rdaQSyWpsreMDnTiZ5IlN0zD) | 17.05.2021 |
    • Han Xiao - ai/jina?style=social)](https://github.com/jina-ai/jina) <ul><li>[data](https://sites.google.com/view/totally-looks-like-dataset)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://docs.jina.ai/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/jina-ai/example-grafana-prometheus/blob/main/grafana-dashboards/flow.json)</li><li>[hub](https://hub.jina.ai/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/playlist?list=PL3UBBWOUVhFYRUa_gpYYKBqEAkO4sxmne), [<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/c/jina-ai)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/docs/Using_Jina_on_Colab.ipynb) | 11.06.2022 |
    • Phil Wang - daze?style=social)](https://github.com/lucidrains/deep-daze) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2103.00020), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2006.09661)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/CLIP)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/deepdaze/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1_YOHdORb0Fg1Q7vWZ_KlrtFe9Ur3pmVj) | 17.03.2021 |
    • Hugging Face
    • Robin Rombach - qp)</li> <li>[Patrick Esser](https://github.com/pesser)</li> <li>[Björn Ommer](https://ommer-lab.com/people/ommer/)</li></ul> | [![](https://img.shields.io/github/stars/CompVis/stable-diffusion?style=social)](https://github.com/CompVis/stable-diffusion) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2205.11487), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2207.12598), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2202.09778), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2108.01073), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2112.10752)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/christophschuhmann/improved-aesthetic-predictor), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ShieldMnt/invisible-watermark), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/guided-diffusion), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/denoising-diffusion-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/lucidrains/x-transformers)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/CompVis), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/laion/laion2B-en), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/datasets/laion/laion-high-resolution)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/CompVis/stable-diffusion/blob/main/scripts/latent_imagenet_diffusion.ipynb) | 10.08.2022 |
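One ingredient cited in the Stable Diffusion entry above is classifier-free guidance (arXiv 2207.12598), which blends the model's unconditional and conditional noise predictions as eps = eps_uncond + s * (eps_cond - eps_uncond). A toy sketch of just that blending step on plain Python lists (the function name and flat-list representation are illustrative assumptions, not the CompVis code, which operates on latent tensors):

```python
def cfg_combine(eps_uncond, eps_cond, guidance_scale):
    """Classifier-free guidance: push the prediction away from the
    unconditional estimate and toward the conditional one.

    guidance_scale = 1.0 recovers the conditional prediction;
    larger values (e.g. 7.5) strengthen prompt adherence.
    """
    return [u + guidance_scale * (c - u)
            for u, c in zip(eps_uncond, eps_cond)]


# With scale 7.5 the blended value overshoots the conditional estimate.
cfg_combine([0.0, 1.0], [1.0, 1.0], 7.5)
```

In a real sampler this combination is applied to the denoiser's output at every diffusion step before the update rule.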
    • Krzysztof Ostrowski - learning)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/tensorflow/federated/blob/master/docs/tutorials/simulations.ipynb) | 28.06.2024 |
    • Glenn Jocher
    • Google
    • Ben Trevett - networks/)</li><li>[LeNet-5](http://yann.lecun.com/exdb/lenet/)</li><li>[guide](https://adeshpande3.github.io/A-Beginner%27s-Guide-To-Understanding-Convolutional-Neural-Networks/)</li><li>[paper](http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf)</li><li>[<img src="images/pwc.svg" alt="pwc" height=20/>](https://paperswithcode.com/method/lenet)</li><li>[<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Convolution), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Sobel_operator), [<img src="images/wiki.svg" alt="wiki" height=20/>](https://en.wikipedia.org/wiki/Gaussian_blur)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/bentrevett/pytorch-image-classification/blob/master/2_lenet.ipynb) | 26.12.2021 |
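The LeNet entry above links background material on convolution and the Sobel operator. A minimal pure-Python 2D "valid" convolution (strictly, the cross-correlation that CNN layers compute) applied with the horizontal Sobel kernel; helper names here are illustrative, not from the linked notebook:

```python
# Horizontal Sobel kernel: responds to left-to-right intensity changes.
SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]


def conv2d_valid(image, kernel):
    """Slide `kernel` over `image` (lists of rows) with no padding,
    returning the 'valid' output of size (H-kh+1) x (W-kw+1)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(image[i + u][j + v] * kernel[u][v]
                            for u in range(kh)
                            for v in range(kw))
    return out


# A vertical edge (dark left half, bright right half) gives a strong
# uniform response under the horizontal Sobel kernel.
edge = [[0, 0, 1, 1],
        [0, 0, 1, 1],
        [0, 0, 1, 1],
        [0, 0, 1, 1]]
conv2d_valid(edge, SOBEL_X)
```

LeNet-style layers do exactly this per channel, but with kernels learned by gradient descent rather than fixed edge detectors.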
    • Roboflow - maestro?style=social)](https://github.com/roboflow/multimodal-maestro) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.11441), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.17421)</li><li>[blog post](https://blog.roboflow.com/multimodal-maestro-advanced-lmm-prompting/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/computervision/comments/186o2b2/multimodal_maestro_prompt_tools_for_use_with_lmms/)</li><li>[website](https://maestro.roboflow.com/)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/roboflow/multimodal-maestro/blob/develop/cookbooks/multimodal_maestro_gpt_4_vision.ipynb) | 30.11.2023 |
    • Mingxing Tan
    • David Bau - Yan Zhu](https://www.cs.cmu.edu/~junyanz/)</li> <li>[Hendrik Strobelt](http://hendrik.strobelt.com/)</li> <li>[Bolei Zhou](https://boleizhou.github.io/)</li><details><summary>others</summary><li>[Joshua Tenenbaum](https://mitibmwatsonailab.mit.edu/people/joshua-tenenbaum/)</li> <li>[William Freeman](https://billf.mit.edu/)</li> <li>[Antonio Torralba](https://groups.csail.mit.edu/vision/torralbalab/)</li></ul></details> | [![](https://img.shields.io/github/stars/CSAILVision/GANDissect?style=social)](https://github.com/CSAILVision/GANDissect) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1811.10597), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1901.09887), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/1807.10221)</li><li>[demo](http://gandissect.res.ibm.com/ganpaint.html)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/CSAILVision/NetDissect), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/junyanz/iGAN)</li><li>[project](https://gandissect.csail.mit.edu/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://www.youtube.com/watch?v=yVCgUYe4JTM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/SIDN-IAP/global-model-repr/blob/master/notebooks/gandissect_solutions.ipynb) | 04.05.2020 |
    • intel - compressor?style=social)](https://github.com/intel/neural-compressor) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.14592), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2309.05516), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2211.07715)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.com/invite/Wxk3J3ZJkU)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://github.com/intel/neural-compressor)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/intel/intel-extension-for-tensorflow), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/intel/intel-extension-for-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Lightning-AI/pytorch-lightning/blob/master/docs/source-pytorch/advanced/post_training_quantization.rst)</li><li>[<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/pytorch/pytorch-inference-acceleration-with-intel-neural-compressor-842ef4210d7d), [<img src="images/medium.svg" alt="medium" height=20/>](https://medium.com/intel-analytics-software/efficient-text-classification-with-intel-neural-compressor-4853296deeac)</li><li>[<img src="images/neurips.svg" alt="neurips" height=20/>](https://neurips.cc/virtual/2022/59433)</li><li>[<img src="images/pt.svg" alt="pt" height=20/>](https://pytorch.org/tutorials/recipes/intel_neural_compressor_for_pytorch.html)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/SswQbIHUrvQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/5xHKe4wWLes), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/H7Gg-EmGpAI), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/ie3w_j0Ntsk), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/m2LokuUdeVg), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/38wrDHEQZuM)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/intel/neural-compressor/blob/master/examples/notebook/onnxruntime/Quick_Started_Notebook_of_INC_for_ONNXRuntime.ipynb) | 27.10.2023 |
    • Jinbo Xing - Tsin Wong](https://ttwong12.github.io/myself.html)</li> <li>[Ying Shan](https://scholar.google.com/citations?user=4oXBp9UAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2310.12190)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/chaojie/ComfyUI-DynamiCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AILab-CVC/VideoCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/YingqingHe/ScaleCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AILab-CVC/TaleCrafter), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/AILab-CVC/FreeNoise)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/Doubiiu/DynamiCrafter_1024)</li><li>[project](https://doubiiu.github.io/projects/DynamiCrafter/)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/StableDiffusion/comments/1aj7gcw/dynamicrafter_gets_updated/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://x.com/noguchis/status/1754488826016432341?s=20)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/0NfmIsNAg-g), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/PtW7hjCawbo)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/camenduru/DynamiCrafter-colab/blob/main/DynamiCrafter_colab_576_1024.ipynb) | 12.02.2024 |
    • Bo Peng - anthony.github.io/)</li> <li>[Alon Albalak](https://alon-albalak.github.io/)</li><details><summary>others</summary><li>[Samuel Arcadinho](https://github.com/SSamDav)</li> <li>[Matteo Grella](http://www.matteogrella.com/)</li> <li>[Kranthi Kiran](https://kranthigv.github.io/)</li> <li>[Haowen Hou](https://github.com/howard-hou)</li> <li>[Przemyslaw Kazienko](https://kazienko.eu/en)</li> <li>[Jan Kocon](https://github.com/KoconJan)</li> <li>[Bartlomiej Koptyra](https://github.com/bkoptyra)</li> <li>[Ipsit Mantri](https://ipsitmantri.github.io/)</li> <li>[Ferdinand Mom](https://3outeille.github.io/)</li> <li>[Xiangru Tang](https://github.com/tangxiangru)</li> <li>[Johan Wind](https://johanwind.github.io/)</li> <li>[Stanisław Woźniak](https://www.researchgate.net/profile/Stanislaw-Wozniak-3)</li> <li>[Qihang Zhao](https://www.researchgate.net/profile/Qihang-Zhao-2)</li> <li>[Peng Zhou](https://pengzhou.sites.ucsc.edu/)</li> <li>[Jian Zhu](https://lingjzhu.github.io/)</li> <li>[Rui-Jie Zhu](https://scholar.google.com/citations?user=08ITzJsAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/BlinkDL/RWKV-LM?style=social)](https://github.com/BlinkDL/RWKV-LM) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2305.13048), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2105.14103), [<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2002.05202)</li><li>[data](https://dldata-public.s3.us-east-2.amazonaws.com/simplebooks.zip)</li><li>[demo](https://josephrocca.github.io/rwkv-v4-web/demo/)</li><li>[<img src="images/discord.svg" alt="discord" height=20/>](https://discord.gg/bDSBUMeFpc)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/saharNooby/rwkv.cpp), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/cgisky1980/ai00_rwkv_server), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/harrisonvanderbyl/rwkv-cpp-cuda), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Blealtan/RWKV-LM-LoRA), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/TheRamU/Fay/blob/main/README_EN.md), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/ridgerchu/SpikeGPT), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/RWKV-v2-RNN-Pile/tree/main/RWKV-v3), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/SmallInitEmb), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/RWKV-CUDA), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/BlinkDL/minGPT-tuned)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/BlinkDL), [<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/BlinkDL/clip-guided-binary-autoencoder)</li><li>[<img src="images/reddit.svg" alt="reddit" height=20/>](https://www.reddit.com/r/MachineLearning/comments/umq908/r_rwkvv2rnn_a_parallelizable_rnn_with/)</li><li>[<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/BlinkDL_AI), [<img src="images/twitter.svg" alt="twitter" height=20/>](https://twitter.com/HochreiterSepp/status/1524270961314484227)</li><li>[website](https://www.rwkv.com/)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/x8pW19wKfXQ), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/B3Qa2rRsaXo), [<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/w-xydM6C6Qc)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1F7tZoPZaWJf1fsCmZ5tjw6sYHiFOYVWM) | 21.09.2022 |
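The RWKV entry above (arXiv 2305.13048) centers on a token-mixing operator that replaces attention with a time-decayed, exponentially weighted average of past values plus a bonus-weighted current value. A naive O(T²) reference sketch of that idea for scalar channels, following my reading of the paper's WKV formula (the function name is illustrative; the real implementation is a streaming recurrence in a CUDA kernel, and this unstabilized form can overflow for large `k`):

```python
import math


def wkv_naive(ks, vs, w, u):
    """Toy RWKV-style mixing for one scalar channel.

    ks, vs : per-token keys and values
    w      : non-negative time-decay (older tokens weigh less)
    u      : bonus applied to the current token
    """
    out = []
    for t in range(len(ks)):
        num = den = 0.0
        for i in range(t):
            # past token i, decayed by its distance from t
            weight = math.exp(-(t - 1 - i) * w + ks[i])
            num += weight * vs[i]
            den += weight
        cur = math.exp(u + ks[t])  # current token with bonus u
        num += cur * vs[t]
        den += cur
        out.append(num / den)
    return out


# With zero decay, zero bonus, and uniform keys, each output is just
# the running mean of the values seen so far.
wkv_naive([0.0, 0.0], [1.0, 3.0], w=0.0, u=0.0)
```

The parallelizable training mode and the RNN inference mode discussed in the paper both compute this same quantity, just factored differently.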
    • Antonin Raffin - RM/rl-baselines3-zoo?style=social)](https://github.com/DLR-RM/rl-baselines3-zoo) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2005.05719)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://stable-baselines3.readthedocs.io/en/master/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/DLR-RM/rl-baselines3-zoo), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/openai/roboschool), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Farama-Foundation/Minigrid)</li><li>[<img src="images/hf.svg" alt="hf" height=20/>](https://huggingface.co/sb3)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/github/Stable-Baselines-Team/rl-colab-notebooks/blob/sb3/rl-baselines-zoo.ipynb) | 14.04.2023 |
    • Xintao Wang - ntu.com/person/ccloy/)</li> <li>[Chao Dong](https://scholar.google.com/citations?user=OSDCB0UAAAAJ)</li></ul></details> | [![](https://img.shields.io/github/stars/XPixelGroup/BasicSR?style=social)](https://github.com/XPixelGroup/BasicSR) <ul><li>[<img src="images/arxiv.svg" alt="arxiv" height=20/>](https://arxiv.org/abs/2012.02181)</li><li>[<img src="images/docs.svg" alt="docs" height=20/>](https://basicsr.readthedocs.io/en/latest/)</li><li>[<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/ESRGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xindongzhang/ECBSR), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/Lotayou/Face-Renovation), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/csxmli2016/DFDNet), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/rosinality/stylegan2-pytorch), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/facexlib), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyView), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyFigure), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/SFTGAN), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/DNI), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyCrawler), [<img src="images/git.svg" alt="git" height=20/>](https://github.com/xinntao/HandyWriting)</li><li>[<img src="images/yt.svg" alt="yt" height=20/>](https://youtu.be/KaMYsxWkmww)</li></ul> | [![Open In Colab](images/colab.svg)](https://colab.research.google.com/drive/1JQScYICvEC3VqaabLu-lxvq9h7kSV1ML) | 07.06.2021 |
    • ![ - alpha/PixArt-sigma)</li> <li>Zejun-Yang/AniPortrait [![](https://img.shields.io/github/stars/Zejun-Yang/AniPortrait?style=social)](https://github.com/Zejun-Yang/AniPortrait)</li> <li>saic-mdal/HiDT [![](https://img.shields.io/github/stars/saic-mdal/HiDT?style=social)](https://github.com/saic-mdal/HiDT)</li> <li>showlab/Show-1 [![](https://img.shields.io/github/stars/showlab/Show-1?style=social)](https://github.com/showlab/Show-1)</li> <li>Artiprocher/DiffSynth-Studio [![](https://img.shields.io/github/stars/Artiprocher/DiffSynth-Studio?style=social)](https://github.com/Artiprocher/DiffSynth-Studio)</li> <li>ai-forever/sage [![](https://img.shields.io/github/stars/ai-forever/sage?style=social)](https://github.com/ai-forever/sage)</li> <li>piddnad/DDColor [![](https://img.shields.io/github/stars/piddnad/DDColor?style=social)](https://github.com/piddnad/DDColor)</li> <li>drengskapur/colab2pdf [![](https://img.shields.io/github/stars/drengskapur/colab2pdf?style=social)](https://github.com/drengskapur/colab2pdf)</li> <li>Doubiiu/DynamiCrafter [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter)</li> <li>google-research/big_vision [![](https://img.shields.io/github/stars/google-research/big_vision?style=social)](https://github.com/google-research/big_vision)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>QwenLM/Qwen-VL [![](https://img.shields.io/github/stars/QwenLM/Qwen-VL?style=social)](https://github.com/QwenLM/Qwen-VL)</li> <li>horseee/DeepCache [![](https://img.shields.io/github/stars/horseee/DeepCache?style=social)](https://github.com/horseee/DeepCache)</li> <li>microsoft/xr-development-for-beginners [![](https://img.shields.io/github/stars/microsoft/xr-development-for-beginners?style=social)](https://github.com/microsoft/xr-development-for-beginners)</li> 
<li>EleutherAI/lm-evaluation-harness [![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)</li> <li>maszhongming/Multi-LoRA-Composition [![](https://img.shields.io/github/stars/maszhongming/Multi-LoRA-Composition?style=social)](https://github.com/maszhongming/Multi-LoRA-Composition)</li> <li>Algolzw/daclip-uir [![](https://img.shields.io/github/stars/Algolzw/daclip-uir?style=social)](https://github.com/Algolzw/daclip-uir)</li> <li>Vahe1994/AQLM [![](https://img.shields.io/github/stars/Vahe1994/AQLM?style=social)](https://github.com/Vahe1994/AQLM)</li> <li>OpenTalker/SadTalker [![](https://img.shields.io/github/stars/OpenTalker/SadTalker?style=social)](https://github.com/OpenTalker/SadTalker)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)</li> <li>google-research/rlds [![](https://img.shields.io/github/stars/google-research/rlds?style=social)](https://github.com/google-research/rlds)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>haotian-liu/LLaVA [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA)</li> <li>roboflow/supervision [![](https://img.shields.io/github/stars/roboflow/supervision?style=social)](https://github.com/roboflow/supervision)</li> <li>alex04072000/NeRViS [![](https://img.shields.io/github/stars/alex04072000/NeRViS?style=social)](https://github.com/alex04072000/NeRViS)</li> <li>RVC-Project/Retrieval-based-Voice-Conversion-WebUI 
[![](https://img.shields.io/github/stars/RVC-Project/Retrieval-based-Voice-Conversion-WebUI?style=social)](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI)</li> <li>graphdeco-inria/gaussian-splatting [![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting)</li> <li>wenquanlu/HandRefiner [![](https://img.shields.io/github/stars/wenquanlu/HandRefiner?style=social)](https://github.com/wenquanlu/HandRefiner)</li> <li>cleanlab/cleanlab [![](https://img.shields.io/github/stars/cleanlab/cleanlab?style=social)](https://github.com/cleanlab/cleanlab)</li> <li>lollcat/fab-torch [![](https://img.shields.io/github/stars/lollcat/fab-torch?style=social)](https://github.com/lollcat/fab-torch)</li> <li>lllyasviel/Fooocus [![](https://img.shields.io/github/stars/lllyasviel/Fooocus?style=social)](https://github.com/lllyasviel/Fooocus)</li> <li>OpenTalker/video-retalking [![](https://img.shields.io/github/stars/OpenTalker/video-retalking?style=social)](https://github.com/OpenTalker/video-retalking)</li></ul> | <ul><li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>GraphCast [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1126/science.adi2336)](https://doi.org/10.1126/science.adi2336)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>SkyAR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2022.3192717)](https://doi.org/10.1109/TIP.2022.3192717)</li> <li>AudioLM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TASLP.2023.3288409)](https://doi.org/10.1109/TASLP.2023.3288409)</li> <li>VideoReTalking [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399)</li> <li>DFL-Colab 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)</li> <li>OWL-ViT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42)</li> <li>AlphaPose [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2022.3222784)](https://doi.org/10.1109/TPAMI.2022.3222784)</li> <li>RITM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICIP46576.2022.9897365)](https://doi.org/10.1109/ICIP46576.2022.9897365)</li> <li>PyTorchVideo [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3474085.3478329)](https://doi.org/10.1145/3474085.3478329)</li> <li>VToonify [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550454.3555437)](https://doi.org/10.1145/3550454.3555437)</li> <li>AvatarCLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530094)](https://doi.org/10.1145/3528223.3530094)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>Omnivore [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01563)](https://doi.org/10.1109/CVPR52688.2022.01563)</li> <li>py-irt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2021.acl-long.346)](https://doi.org/10.18653/v1/2021.acl-long.346)</li> <li>GLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069)</li> <li>FILM 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_15)](https://doi.org/10.1007/978-3-031-20071-7_15)</li> <li>DualStyleGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00754)](https://doi.org/10.1109/CVPR52688.2022.00754)</li> <li>SAHI [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICIP46576.2022.9897990)](https://doi.org/10.1109/ICIP46576.2022.9897990)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> <li>ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)</li> <li>LaSAFT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICASSP39728.2021.9413896)](https://doi.org/10.1109/ICASSP39728.2021.9413896)</li> <li>AlphaTensor [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4)</li> <li>StyleGAN-Human [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-19787-1_1)](https://doi.org/10.1007/978-3-031-19787-1_1)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>Background Matting V2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.00865)](https://doi.org/10.1109/CVPR46437.2021.00865)</li> <li>ConvNeXt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)</li> <li>StyleGAN-NADA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530164)](https://doi.org/10.1145/3528223.3530164)</li> <li>AnimeGANv2 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-981-15-5577-0_18)](https://doi.org/10.1007/978-981-15-5577-0_18)</li></ul> |
    • ![ - Studio)</li> <li>PixArt-alpha/PixArt-sigma [![](https://img.shields.io/github/stars/PixArt-alpha/PixArt-sigma?style=social)](https://github.com/PixArt-alpha/PixArt-sigma)</li> <li>lllyasviel/IC-Light [![](https://img.shields.io/github/stars/lllyasviel/IC-Light?style=social)](https://github.com/lllyasviel/IC-Light)</li> <li>jxnl/instructor [![](https://img.shields.io/github/stars/jxnl/instructor?style=social)](https://github.com/jxnl/instructor)</li> <li>roboflow/supervision [![](https://img.shields.io/github/stars/roboflow/supervision?style=social)](https://github.com/roboflow/supervision)</li> <li>voxel51/fiftyone [![](https://img.shields.io/github/stars/voxel51/fiftyone?style=social)](https://github.com/voxel51/fiftyone)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>Doubiiu/DynamiCrafter [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter)</li> <li>HVision-NKU/StoryDiffusion [![](https://img.shields.io/github/stars/HVision-NKU/StoryDiffusion?style=social)](https://github.com/HVision-NKU/StoryDiffusion)</li> <li>metavoiceio/metavoice-src [![](https://img.shields.io/github/stars/metavoiceio/metavoice-src?style=social)](https://github.com/metavoiceio/metavoice-src)</li> <li>NVIDIA/TensorRT [![](https://img.shields.io/github/stars/NVIDIA/TensorRT?style=social)](https://github.com/NVIDIA/TensorRT)</li> <li>google-research/big_vision [![](https://img.shields.io/github/stars/google-research/big_vision?style=social)](https://github.com/google-research/big_vision)</li> <li>horseee/DeepCache [![](https://img.shields.io/github/stars/horseee/DeepCache?style=social)](https://github.com/horseee/DeepCache)</li> <li>tincans-ai/gazelle [![](https://img.shields.io/github/stars/tincans-ai/gazelle?style=social)](https://github.com/tincans-ai/gazelle)</li> <li>EleutherAI/lm-evaluation-harness 
[![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)</li> <li>google-research/rlds [![](https://img.shields.io/github/stars/google-research/rlds?style=social)](https://github.com/google-research/rlds)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>QwenLM/Qwen-VL [![](https://img.shields.io/github/stars/QwenLM/Qwen-VL?style=social)](https://github.com/QwenLM/Qwen-VL)</li> <li>IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>guoyww/animatediff [![](https://img.shields.io/github/stars/guoyww/animatediff?style=social)](https://github.com/guoyww/animatediff)</li> <li>ai-forever/sage [![](https://img.shields.io/github/stars/ai-forever/sage?style=social)](https://github.com/ai-forever/sage)</li> <li>Zejun-Yang/AniPortrait [![](https://img.shields.io/github/stars/Zejun-Yang/AniPortrait?style=social)](https://github.com/Zejun-Yang/AniPortrait)</li> <li>damian0815/compel [![](https://img.shields.io/github/stars/damian0815/compel?style=social)](https://github.com/damian0815/compel)</li> <li>NVIDIA/NeMo [![](https://img.shields.io/github/stars/NVIDIA/NeMo?style=social)](https://github.com/NVIDIA/NeMo)</li> <li>coqui-ai/TTS [![](https://img.shields.io/github/stars/coqui-ai/TTS?style=social)](https://github.com/coqui-ai/TTS)</li> <li>haotian-liu/LLaVA [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA)</li> <li>graphdeco-inria/gaussian-splatting 
[![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting)</li> <li>autodistill/autodistill [![](https://img.shields.io/github/stars/autodistill/autodistill?style=social)](https://github.com/autodistill/autodistill)</li> <li>piddnad/DDColor [![](https://img.shields.io/github/stars/piddnad/DDColor?style=social)](https://github.com/piddnad/DDColor)</li> <li>wenquanlu/HandRefiner [![](https://img.shields.io/github/stars/wenquanlu/HandRefiner?style=social)](https://github.com/wenquanlu/HandRefiner)</li> <li>infer-actively/pymdp [![](https://img.shields.io/github/stars/infer-actively/pymdp?style=social)](https://github.com/infer-actively/pymdp)</li> <li>vwxyzjn/cleanrl [![](https://img.shields.io/github/stars/vwxyzjn/cleanrl?style=social)](https://github.com/vwxyzjn/cleanrl)</li> <li>RVC-Project/Retrieval-based-Voice-Conversion-WebUI [![](https://img.shields.io/github/stars/RVC-Project/Retrieval-based-Voice-Conversion-WebUI?style=social)](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI)</li> <li>google-deepmind/tapnet [![](https://img.shields.io/github/stars/google-deepmind/tapnet?style=social)](https://github.com/google-deepmind/tapnet)</li></ul> | <ul><li>LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)</li> <li>SkyAR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2022.3192717)](https://doi.org/10.1109/TIP.2022.3192717)</li> <li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>VideoReTalking [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399)</li> <li>AudioLM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TASLP.2023.3288409)](https://doi.org/10.1109/TASLP.2023.3288409)</li> <li>GraphCast 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1126/science.adi2336)](https://doi.org/10.1126/science.adi2336)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>ECON [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00057)](https://doi.org/10.1109/CVPR52729.2023.00057)</li> <li>AlphaPose [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2022.3222784)](https://doi.org/10.1109/TPAMI.2022.3222784)</li> <li>DFL-Colab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628)</li> <li>StyleGAN-Human [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-19787-1_1)](https://doi.org/10.1007/978-3-031-19787-1_1)</li> <li>SAHI [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICIP46576.2022.9897990)](https://doi.org/10.1109/ICIP46576.2022.9897990)</li> <li>RVM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/WACV51458.2022.00319)](https://doi.org/10.1109/WACV51458.2022.00319)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>deep-significance [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>FILM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_15)](https://doi.org/10.1007/978-3-031-20071-7_15)</li> <li>AnimeGANv2 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-981-15-5577-0_18)](https://doi.org/10.1007/978-981-15-5577-0_18)</li> <li>ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)</li> <li>Dream Fields [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00094)](https://doi.org/10.1109/CVPR52688.2022.00094)</li> <li>Omnivore [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01563)](https://doi.org/10.1109/CVPR52688.2022.01563)</li> <li>AvatarCLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530094)](https://doi.org/10.1145/3528223.3530094)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>ConvNeXt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)</li> <li>Stylized Neural Painting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01543)](https://doi.org/10.1109/CVPR46437.2021.01543)</li> <li>StylEx [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00073)](https://doi.org/10.1109/ICCV48922.2021.00073)</li> <li>HiDT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00751)](https://doi.org/10.1109/CVPR42600.2020.00751)</li> <li>encoder4editing [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3450626.3459838)](https://doi.org/10.1145/3450626.3459838)</li> <li>PIFuHD [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00016)](https://doi.org/10.1109/CVPR42600.2020.00016)</li> <li>HyperStyle [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01796)](https://doi.org/10.1109/CVPR52688.2022.01796)</li> <li>Nerfies 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00581)](https://doi.org/10.1109/ICCV48922.2021.00581)</li> <li>GLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>FGVC [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58610-2_42)](https://doi.org/10.1007/978-3-030-58610-2_42)</li></ul> |
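Every dependency and citation cell in the rows above is assembled from the same two formulaic badge patterns: a shields.io star badge keyed by the GitHub `owner/repo` slug, and a juleskreuer.eu citation badge keyed by DOI. A minimal sketch of how one such markdown entry is built (the helper names here are our own, not part of the list):

```python
def star_badge(repo: str) -> str:
    """Markdown image-link: social-style GitHub star count linking to the repo."""
    return (
        f"[![](https://img.shields.io/github/stars/{repo}?style=social)]"
        f"(https://github.com/{repo})"
    )

def citation_badge(doi: str) -> str:
    """Markdown image-link: citation-count badge linking to the DOI resolver."""
    return (
        f"[![](https://api.juleskreuer.eu/citation-badge.php?doi={doi})]"
        f"(https://doi.org/{doi})"
    )

def repo_item(repo: str) -> str:
    """One <li> entry in the same shape as the dependency cells above."""
    return f"<li>{repo} {star_badge(repo)}</li>"

# Example: reproduce the ComfyUI entry from the list
print(repo_item("comfyanonymous/ComfyUI"))
```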
    • <ul><li>… - torch)</li> <li>Camb-ai/MARS5-TTS [![](https://img.shields.io/github/stars/Camb-ai/MARS5-TTS?style=social)](https://github.com/Camb-ai/MARS5-TTS)</li> <li>Vahe1994/AQLM [![](https://img.shields.io/github/stars/Vahe1994/AQLM?style=social)](https://github.com/Vahe1994/AQLM)</li> <li>Artiprocher/DiffSynth-Studio [![](https://img.shields.io/github/stars/Artiprocher/DiffSynth-Studio?style=social)](https://github.com/Artiprocher/DiffSynth-Studio)</li> <li>Doubiiu/DynamiCrafter [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter)</li> <li>drengskapur/colab2pdf [![](https://img.shields.io/github/stars/drengskapur/colab2pdf?style=social)](https://github.com/drengskapur/colab2pdf)</li> <li>lllyasviel/IC-Light [![](https://img.shields.io/github/stars/lllyasviel/IC-Light?style=social)](https://github.com/lllyasviel/IC-Light)</li> <li>TencentARC/InstantMesh [![](https://img.shields.io/github/stars/TencentARC/InstantMesh?style=social)](https://github.com/TencentARC/InstantMesh)</li> <li>ToTheBeginning/PuLID [![](https://img.shields.io/github/stars/ToTheBeginning/PuLID?style=social)](https://github.com/ToTheBeginning/PuLID)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>naklecha/llama3-from-scratch [![](https://img.shields.io/github/stars/naklecha/llama3-from-scratch?style=social)](https://github.com/naklecha/llama3-from-scratch)</li> <li>PixArt-alpha/PixArt-sigma [![](https://img.shields.io/github/stars/PixArt-alpha/PixArt-sigma?style=social)](https://github.com/PixArt-alpha/PixArt-sigma)</li> <li>jxnl/instructor [![](https://img.shields.io/github/stars/jxnl/instructor?style=social)](https://github.com/jxnl/instructor)</li> <li>roboflow/supervision [![](https://img.shields.io/github/stars/roboflow/supervision?style=social)](https://github.com/roboflow/supervision)</li> 
<li>google-deepmind/tapnet [![](https://img.shields.io/github/stars/google-deepmind/tapnet?style=social)](https://github.com/google-deepmind/tapnet)</li> <li>QwenLM/Qwen-VL [![](https://img.shields.io/github/stars/QwenLM/Qwen-VL?style=social)](https://github.com/QwenLM/Qwen-VL)</li> <li>ToonCrafter/ToonCrafter [![](https://img.shields.io/github/stars/ToonCrafter/ToonCrafter?style=social)](https://github.com/ToonCrafter/ToonCrafter)</li> <li>horseee/DeepCache [![](https://img.shields.io/github/stars/horseee/DeepCache?style=social)](https://github.com/horseee/DeepCache)</li> <li>EleutherAI/lm-evaluation-harness [![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)</li> <li>IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)</li> <li>google-research/big_vision [![](https://img.shields.io/github/stars/google-research/big_vision?style=social)](https://github.com/google-research/big_vision)</li> <li>RVC-Project/Retrieval-based-Voice-Conversion-WebUI [![](https://img.shields.io/github/stars/RVC-Project/Retrieval-based-Voice-Conversion-WebUI?style=social)](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI)</li></ul> | <ul><li>FollowYourPose [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1609/aaai.v38i5.28206)](https://doi.org/10.1609/aaai.v38i5.28206)</li> <li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>AudioLM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TASLP.2023.3288409)](https://doi.org/10.1109/TASLP.2023.3288409)</li> <li>VideoReTalking [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399)</li> <li>SadTalker 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)</li> <li>DFL-Colab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628)</li> <li>Swin2SR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-25063-7_42)](https://doi.org/10.1007/978-3-031-25063-7_42)</li> <li>ECON [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00057)](https://doi.org/10.1109/CVPR52729.2023.00057)</li> <li>GraphCast [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1126/science.adi2336)](https://doi.org/10.1126/science.adi2336)</li> <li>LaSAFT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICASSP39728.2021.9413896)](https://doi.org/10.1109/ICASSP39728.2021.9413896)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>VRT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2024.3372454)](https://doi.org/10.1109/TIP.2024.3372454)</li> <li>PyTerrier [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3459637.3482013)](https://doi.org/10.1145/3459637.3482013)</li> <li>DualStyleGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00754)](https://doi.org/10.1109/CVPR52688.2022.00754)</li> <li>AlphaPose [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2022.3222784)](https://doi.org/10.1109/TPAMI.2022.3222784)</li> <li>SAHI [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICIP46576.2022.9897990)](https://doi.org/10.1109/ICIP46576.2022.9897990)</li> <li>Motion Supervised co-part Segmentation 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICPR48806.2021.9412520)](https://doi.org/10.1109/ICPR48806.2021.9412520)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>SimSwap [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3394171.3413630)](https://doi.org/10.1145/3394171.3413630)</li> <li>Lifespan Age Transformation Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58539-6_44)](https://doi.org/10.1007/978-3-030-58539-6_44)</li></ul> |
    • <ul><li>QwenLM/Qwen-VL [![](https://img.shields.io/github/stars/QwenLM/Qwen-VL?style=social)](https://github.com/QwenLM/Qwen-VL)</li> <li>wilson1yan/VideoGPT [![](https://img.shields.io/github/stars/wilson1yan/VideoGPT?style=social)](https://github.com/wilson1yan/VideoGPT)</li> <li>mistralai/mistral-src [![](https://img.shields.io/github/stars/mistralai/mistral-src?style=social)](https://github.com/mistralai/mistral-src)</li> <li>EleutherAI/lm-evaluation-harness [![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)</li> <li>facebookresearch/av_hubert [![](https://img.shields.io/github/stars/facebookresearch/av_hubert?style=social)](https://github.com/facebookresearch/av_hubert)</li> <li>haotian-liu/LLaVA [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA)</li> <li>google-deepmind/tapnet [![](https://img.shields.io/github/stars/google-deepmind/tapnet?style=social)](https://github.com/google-deepmind/tapnet)</li> <li>lollcat/fab-torch [![](https://img.shields.io/github/stars/lollcat/fab-torch?style=social)](https://github.com/lollcat/fab-torch)</li> <li>guoyww/animatediff [![](https://img.shields.io/github/stars/guoyww/animatediff?style=social)](https://github.com/guoyww/animatediff)</li> <li>autodistill/autodistill [![](https://img.shields.io/github/stars/autodistill/autodistill?style=social)](https://github.com/autodistill/autodistill)</li> <li>wenquanlu/HandRefiner [![](https://img.shields.io/github/stars/wenquanlu/HandRefiner?style=social)](https://github.com/wenquanlu/HandRefiner)</li> <li>damian0815/compel [![](https://img.shields.io/github/stars/damian0815/compel?style=social)](https://github.com/damian0815/compel)</li> <li>microsoft/lida [![](https://img.shields.io/github/stars/microsoft/lida?style=social)](https://github.com/microsoft/lida)</li> <li>lllyasviel/Fooocus 
[![](https://img.shields.io/github/stars/lllyasviel/Fooocus?style=social)](https://github.com/lllyasviel/Fooocus)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>ai-forever/sage [![](https://img.shields.io/github/stars/ai-forever/sage?style=social)](https://github.com/ai-forever/sage)</li> <li>yangxy/PASD [![](https://img.shields.io/github/stars/yangxy/PASD?style=social)](https://github.com/yangxy/PASD)</li> <li>VincentStimper/normalizing-flows [![](https://img.shields.io/github/stars/VincentStimper/normalizing-flows?style=social)](https://github.com/VincentStimper/normalizing-flows)</li> <li>graphdeco-inria/gaussian-splatting [![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting)</li> <li>microsoft/xr-development-for-beginners [![](https://img.shields.io/github/stars/microsoft/xr-development-for-beginners?style=social)](https://github.com/microsoft/xr-development-for-beginners)</li> <li>OpenTalker/video-retalking [![](https://img.shields.io/github/stars/OpenTalker/video-retalking?style=social)](https://github.com/OpenTalker/video-retalking)</li> <li>threestudio-project/threestudio [![](https://img.shields.io/github/stars/threestudio-project/threestudio?style=social)](https://github.com/threestudio-project/threestudio)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>ai-forever/ghost [![](https://img.shields.io/github/stars/ai-forever/ghost?style=social)](https://github.com/ai-forever/ghost)</li> <li>ai-forever/sber-swap [![](https://img.shields.io/github/stars/ai-forever/sber-swap?style=social)](https://github.com/ai-forever/sber-swap)</li> <li>rohitgandikota/sliders 
[![](https://img.shields.io/github/stars/rohitgandikota/sliders?style=social)](https://github.com/rohitgandikota/sliders)</li></ul> | <ul><li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>FILM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_15)](https://doi.org/10.1007/978-3-031-20071-7_15)</li> <li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>Motion Supervised co-part Segmentation [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICPR48806.2021.9412520)](https://doi.org/10.1109/ICPR48806.2021.9412520)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>StyleGAN-Human [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-19787-1_1)](https://doi.org/10.1007/978-3-031-19787-1_1)</li> <li>MSG-Net [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-11018-5_32)](https://doi.org/10.1007/978-3-030-11018-5_32)</li> <li>ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>LaSAFT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICASSP39728.2021.9413896)](https://doi.org/10.1109/ICASSP39728.2021.9413896)</li> <li>AudioLM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TASLP.2023.3288409)](https://doi.org/10.1109/TASLP.2023.3288409)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>ConvNeXt 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>SAHI [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICIP46576.2022.9897990)](https://doi.org/10.1109/ICIP46576.2022.9897990)</li> <li>AnimeGANv2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-981-15-5577-0_18)](https://doi.org/10.1007/978-981-15-5577-0_18)</li> <li>Cleanlab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1613/jair.1.12125)](https://doi.org/10.1613/jair.1.12125)</li> <li>deep-significance [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266)</li> <li>AlphaTensor [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4)</li> <li>HyperStyle [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01796)](https://doi.org/10.1109/CVPR52688.2022.01796)</li> <li>GLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069)</li> <li>Stylized Neural Painting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01543)](https://doi.org/10.1109/CVPR46437.2021.01543)</li> <li>Dream Fields [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00094)](https://doi.org/10.1109/CVPR52688.2022.00094)</li> <li>Skillful Precipitation Nowcasting Using Deep Generative Models of Radar [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03854-z)](https://doi.org/10.1038/s41586-021-03854-z)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> 
<li>AlphaFold [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03819-2)](https://doi.org/10.1038/s41586-021-03819-2)</li> <li>Taming Transformers for High-Resolution Image Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268)</li> <li>FGVC [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58610-2_42)](https://doi.org/10.1007/978-3-030-58610-2_42)</li> <li>StyleGAN 2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.00813)](https://doi.org/10.1109/CVPR42600.2020.00813)</li> <li>Open-Unmix [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.21105/joss.01667)](https://doi.org/10.21105/joss.01667)</li> <li>AvatarCLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530094)](https://doi.org/10.1145/3528223.3530094)</li></ul> |
    • <ul><li>naklecha/llama3-from-scratch [![](https://img.shields.io/github/stars/naklecha/llama3-from-scratch?style=social)](https://github.com/naklecha/llama3-from-scratch)</li> <li>unslothai/unsloth [![](https://img.shields.io/github/stars/unslothai/unsloth?style=social)](https://github.com/unslothai/unsloth)</li> <li>Vahe1994/AQLM [![](https://img.shields.io/github/stars/Vahe1994/AQLM?style=social)](https://github.com/Vahe1994/AQLM)</li> <li>lllyasviel/IC-Light [![](https://img.shields.io/github/stars/lllyasviel/IC-Light?style=social)](https://github.com/lllyasviel/IC-Light)</li> <li>jxnl/instructor [![](https://img.shields.io/github/stars/jxnl/instructor?style=social)](https://github.com/jxnl/instructor)</li> <li>microsoft/torchgeo [![](https://img.shields.io/github/stars/microsoft/torchgeo?style=social)](https://github.com/microsoft/torchgeo)</li> <li>google-research/rlds [![](https://img.shields.io/github/stars/google-research/rlds?style=social)](https://github.com/google-research/rlds)</li> <li>Doubiiu/DynamiCrafter [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter)</li> <li>HongwenZhang/PyMAF-X [![](https://img.shields.io/github/stars/HongwenZhang/PyMAF-X?style=social)](https://github.com/HongwenZhang/PyMAF-X)</li> <li>ai-forever/sage [![](https://img.shields.io/github/stars/ai-forever/sage?style=social)](https://github.com/ai-forever/sage)</li> <li>IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)</li> <li>Camb-ai/MARS5-TTS [![](https://img.shields.io/github/stars/Camb-ai/MARS5-TTS?style=social)](https://github.com/Camb-ai/MARS5-TTS)</li> <li>google-research/big_vision [![](https://img.shields.io/github/stars/google-research/big_vision?style=social)](https://github.com/google-research/big_vision)</li> <li>TencentARC/InstantMesh 
[![](https://img.shields.io/github/stars/TencentARC/InstantMesh?style=social)](https://github.com/TencentARC/InstantMesh)</li> <li>EleutherAI/lm-evaluation-harness [![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>autodistill/autodistill [![](https://img.shields.io/github/stars/autodistill/autodistill?style=social)](https://github.com/autodistill/autodistill)</li> <li>vwxyzjn/cleanrl [![](https://img.shields.io/github/stars/vwxyzjn/cleanrl?style=social)](https://github.com/vwxyzjn/cleanrl)</li> <li>tincans-ai/gazelle [![](https://img.shields.io/github/stars/tincans-ai/gazelle?style=social)](https://github.com/tincans-ai/gazelle)</li></ul> | <ul><li>FateZero [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.01460)](https://doi.org/10.1109/ICCV51070.2023.01460)</li> <li>GraphCast [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1126/science.adi2336)](https://doi.org/10.1126/science.adi2336)</li> <li>VRT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2024.3372454)](https://doi.org/10.1109/TIP.2024.3372454)</li> <li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>deep-significance [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266)</li> <li>OWL-ViT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42)</li> <li>Panini-Net [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1609/aaai.v36i3.20159)](https://doi.org/10.1609/aaai.v36i3.20159)</li> <li>py-irt 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2021.acl-long.346)](https://doi.org/10.18653/v1/2021.acl-long.346)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)</li> <li>Thin-Plate Spline Motion Model [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00364)](https://doi.org/10.1109/CVPR52688.2022.00364)</li> <li>PyMAF-X [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2023.3271691)](https://doi.org/10.1109/TPAMI.2023.3271691)</li> <li>AlphaPose [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2022.3222784)](https://doi.org/10.1109/TPAMI.2022.3222784)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> <li>Swin2SR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-25063-7_42)](https://doi.org/10.1007/978-3-031-25063-7_42)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>Instant-NGP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530127)](https://doi.org/10.1145/3528223.3530127)</li> <li>ECON [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00057)](https://doi.org/10.1109/CVPR52729.2023.00057)</li> <li>NeRViS 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00230)](https://doi.org/10.1109/ICCV48922.2021.00230)</li> <li>Scenic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.02070)](https://doi.org/10.1109/CVPR52688.2022.02070)</li></ul> |
    • <ul><li>QwenLM/Qwen-VL [![](https://img.shields.io/github/stars/QwenLM/Qwen-VL?style=social)](https://github.com/QwenLM/Qwen-VL)</li> <li>mistralai/mistral-src [![](https://img.shields.io/github/stars/mistralai/mistral-src?style=social)](https://github.com/mistralai/mistral-src)</li> <li>horseee/DeepCache [![](https://img.shields.io/github/stars/horseee/DeepCache?style=social)](https://github.com/horseee/DeepCache)</li> <li>lollcat/fab-torch [![](https://img.shields.io/github/stars/lollcat/fab-torch?style=social)](https://github.com/lollcat/fab-torch)</li> <li>ai-forever/sage [![](https://img.shields.io/github/stars/ai-forever/sage?style=social)](https://github.com/ai-forever/sage)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>KillianLucas/open-interpreter [![](https://img.shields.io/github/stars/KillianLucas/open-interpreter?style=social)](https://github.com/KillianLucas/open-interpreter)</li> <li>haotian-liu/LLaVA [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA)</li> <li>autodistill/autodistill [![](https://img.shields.io/github/stars/autodistill/autodistill?style=social)](https://github.com/autodistill/autodistill)</li> <li>lllyasviel/Fooocus [![](https://img.shields.io/github/stars/lllyasviel/Fooocus?style=social)](https://github.com/lllyasviel/Fooocus)</li> <li>intel/intel-extension-for-transformers [![](https://img.shields.io/github/stars/intel/intel-extension-for-transformers?style=social)](https://github.com/intel/intel-extension-for-transformers)</li> <li>guoyww/animatediff [![](https://img.shields.io/github/stars/guoyww/animatediff?style=social)](https://github.com/guoyww/animatediff)</li> <li>graphdeco-inria/gaussian-splatting 
[![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting)</li> <li>facebookresearch/co-tracker [![](https://img.shields.io/github/stars/facebookresearch/co-tracker?style=social)](https://github.com/facebookresearch/co-tracker)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>OpenTalker/video-retalking [![](https://img.shields.io/github/stars/OpenTalker/video-retalking?style=social)](https://github.com/OpenTalker/video-retalking)</li> <li>microsoft/xr-development-for-beginners [![](https://img.shields.io/github/stars/microsoft/xr-development-for-beginners?style=social)](https://github.com/microsoft/xr-development-for-beginners)</li> <li>nerfstudio-project/nerfstudio [![](https://img.shields.io/github/stars/nerfstudio-project/nerfstudio?style=social)](https://github.com/nerfstudio-project/nerfstudio)</li> <li>threestudio-project/threestudio [![](https://img.shields.io/github/stars/threestudio-project/threestudio?style=social)](https://github.com/threestudio-project/threestudio)</li> <li>damian0815/compel [![](https://img.shields.io/github/stars/damian0815/compel?style=social)](https://github.com/damian0815/compel)</li> <li>dreamgaussian/dreamgaussian [![](https://img.shields.io/github/stars/dreamgaussian/dreamgaussian?style=social)](https://github.com/dreamgaussian/dreamgaussian)</li> <li>IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)</li> <li>facebookresearch/home-robot [![](https://img.shields.io/github/stars/facebookresearch/home-robot?style=social)](https://github.com/facebookresearch/home-robot)</li> <li>Algolzw/daclip-uir [![](https://img.shields.io/github/stars/Algolzw/daclip-uir?style=social)](https://github.com/Algolzw/daclip-uir)</li> 
<li>dome272/wuerstchen [![](https://img.shields.io/github/stars/dome272/wuerstchen?style=social)](https://github.com/dome272/wuerstchen)</li> <li>coqui-ai/TTS [![](https://img.shields.io/github/stars/coqui-ai/TTS?style=social)](https://github.com/coqui-ai/TTS)</li> <li>huggingface/trl [![](https://img.shields.io/github/stars/huggingface/trl?style=social)](https://github.com/huggingface/trl)</li></ul> | <ul><li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>DFL-Colab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>AudioLM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TASLP.2023.3288409)](https://doi.org/10.1109/TASLP.2023.3288409)</li> <li>OWL-ViT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42)</li> <li>VToonify [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550454.3555437)](https://doi.org/10.1145/3550454.3555437)</li> <li>VideoReTalking [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399)</li> <li>StyleGAN-Human [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-19787-1_1)](https://doi.org/10.1007/978-3-031-19787-1_1)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>LDM 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> <li>GLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)</li> <li>Dream Fields [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00094)](https://doi.org/10.1109/CVPR52688.2022.00094)</li> <li>FILM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_15)](https://doi.org/10.1007/978-3-031-20071-7_15)</li> <li>ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)</li> <li>AvatarCLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3528223.3530094)](https://doi.org/10.1145/3528223.3530094)</li> <li>ConvNeXt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)</li> <li>Rethinking Style Transfer: From Pixels to Parameterized Brushstrokes [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01202)](https://doi.org/10.1109/CVPR46437.2021.01202)</li> <li>DualStyleGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00754)](https://doi.org/10.1109/CVPR52688.2022.00754)</li> <li>GHOST [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ACCESS.2022.3196668)](https://doi.org/10.1109/ACCESS.2022.3196668)</li> <li>deep-significance 
[![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266)</li> <li>Omnivore [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01563)](https://doi.org/10.1109/CVPR52688.2022.01563)</li> <li>SAHI [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICIP46576.2022.9897990)](https://doi.org/10.1109/ICIP46576.2022.9897990)</li> <li>AlphaTensor [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4)</li> <li>SkyAR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2022.3192717)](https://doi.org/10.1109/TIP.2022.3192717)</li> <li>Taming Transformers for High-Resolution Image Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268)</li></ul> |
    • ![ - Studio [![](https://img.shields.io/github/stars/Artiprocher/DiffSynth-Studio?style=social)](https://github.com/Artiprocher/DiffSynth-Studio)</li> <li>wilson1yan/VideoGPT [![](https://img.shields.io/github/stars/wilson1yan/VideoGPT?style=social)](https://github.com/wilson1yan/VideoGPT)</li> <li>horseee/DeepCache [![](https://img.shields.io/github/stars/horseee/DeepCache?style=social)](https://github.com/horseee/DeepCache)</li> <li>metavoiceio/metavoice-src [![](https://img.shields.io/github/stars/metavoiceio/metavoice-src?style=social)](https://github.com/metavoiceio/metavoice-src)</li> <li>QwenLM/Qwen-VL [![](https://img.shields.io/github/stars/QwenLM/Qwen-VL?style=social)](https://github.com/QwenLM/Qwen-VL)</li> <li>EleutherAI/lm-evaluation-harness [![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)</li> <li>KillianLucas/open-interpreter [![](https://img.shields.io/github/stars/KillianLucas/open-interpreter?style=social)](https://github.com/KillianLucas/open-interpreter)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>piddnad/DDColor [![](https://img.shields.io/github/stars/piddnad/DDColor?style=social)](https://github.com/piddnad/DDColor)</li> <li>google-research/big_vision [![](https://img.shields.io/github/stars/google-research/big_vision?style=social)](https://github.com/google-research/big_vision)</li> <li>microsoft/xr-development-for-beginners [![](https://img.shields.io/github/stars/microsoft/xr-development-for-beginners?style=social)](https://github.com/microsoft/xr-development-for-beginners)</li> <li>OpenTalker/video-retalking [![](https://img.shields.io/github/stars/OpenTalker/video-retalking?style=social)](https://github.com/OpenTalker/video-retalking)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>nd-ball/py-irt [![](https://img.shields.io/github/stars/nd-ball/py-irt?style=social)](https://github.com/nd-ball/py-irt)</li> <li>drengskapur/colab2pdf [![](https://img.shields.io/github/stars/drengskapur/colab2pdf?style=social)](https://github.com/drengskapur/colab2pdf)</li> <li>graphdeco-inria/gaussian-splatting [![](https://img.shields.io/github/stars/graphdeco-inria/gaussian-splatting?style=social)](https://github.com/graphdeco-inria/gaussian-splatting)</li> <li>haotian-liu/LLaVA [![](https://img.shields.io/github/stars/haotian-liu/LLaVA?style=social)](https://github.com/haotian-liu/LLaVA)</li> <li>microsoft/autogen [![](https://img.shields.io/github/stars/microsoft/autogen?style=social)](https://github.com/microsoft/autogen)</li> <li>ShishirPatil/gorilla [![](https://img.shields.io/github/stars/ShishirPatil/gorilla?style=social)](https://github.com/ShishirPatil/gorilla)</li> <li>wenquanlu/HandRefiner [![](https://img.shields.io/github/stars/wenquanlu/HandRefiner?style=social)](https://github.com/wenquanlu/HandRefiner)</li> <li>facebookresearch/home-robot [![](https://img.shields.io/github/stars/facebookresearch/home-robot?style=social)](https://github.com/facebookresearch/home-robot)</li> <li>guoyww/animatediff [![](https://img.shields.io/github/stars/guoyww/animatediff?style=social)](https://github.com/guoyww/animatediff)</li> <li>IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)</li> <li>mistralai/mistral-src [![](https://img.shields.io/github/stars/mistralai/mistral-src?style=social)](https://github.com/mistralai/mistral-src)</li> <li>TencentARC/PhotoMaker [![](https://img.shields.io/github/stars/TencentARC/PhotoMaker?style=social)](https://github.com/TencentARC/PhotoMaker)</li> <li>Stability-AI/StableCascade [![](https://img.shields.io/github/stars/Stability-AI/StableCascade?style=social)](https://github.com/Stability-AI/StableCascade)</li> <li>Algolzw/daclip-uir [![](https://img.shields.io/github/stars/Algolzw/daclip-uir?style=social)](https://github.com/Algolzw/daclip-uir)</li> <li>rohitgandikota/sliders [![](https://img.shields.io/github/stars/rohitgandikota/sliders?style=social)](https://github.com/rohitgandikota/sliders)</li> <li>sczhou/CodeFormer [![](https://img.shields.io/github/stars/sczhou/CodeFormer?style=social)](https://github.com/sczhou/CodeFormer)</li> <li>damian0815/compel [![](https://img.shields.io/github/stars/damian0815/compel?style=social)](https://github.com/damian0815/compel)</li></ul> | <ul><li>LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)</li> <li>SkyAR [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2022.3192717)](https://doi.org/10.1109/TIP.2022.3192717)</li> <li>DFL-Colab [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1016/j.patcog.2023.109628)](https://doi.org/10.1016/j.patcog.2023.109628)</li> <li>GHOST [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ACCESS.2022.3196668)](https://doi.org/10.1109/ACCESS.2022.3196668)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>py-irt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2021.acl-long.346)](https://doi.org/10.18653/v1/2021.acl-long.346)</li> <li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>PyTerrier [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3459637.3482013)](https://doi.org/10.1145/3459637.3482013)</li> <li>ByteTrack [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20047-2_1)](https://doi.org/10.1007/978-3-031-20047-2_1)</li> <li>Motion Supervised co-part Segmentation [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICPR48806.2021.9412520)](https://doi.org/10.1109/ICPR48806.2021.9412520)</li> <li>VToonify [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550454.3555437)](https://doi.org/10.1145/3550454.3555437)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>OWL-ViT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42)</li> <li>FILM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_15)](https://doi.org/10.1007/978-3-031-20071-7_15)</li> <li>Rethinking Style Transfer: From Pixels to Parameterized Brushstrokes [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01202)](https://doi.org/10.1109/CVPR46437.2021.01202)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>LaSAFT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICASSP39728.2021.9413896)](https://doi.org/10.1109/ICASSP39728.2021.9413896)</li> <li>ConvNeXt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01167)](https://doi.org/10.1109/CVPR52688.2022.01167)</li> <li>Mask2Former [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00135)](https://doi.org/10.1109/CVPR52688.2022.00135)</li> <li>SimSwap [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3394171.3413630)](https://doi.org/10.1145/3394171.3413630)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>MSG-Net [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-11018-5_32)](https://doi.org/10.1007/978-3-030-11018-5_32)</li> <li>Geometry-Free View Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.01409)](https://doi.org/10.1109/ICCV48922.2021.01409)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>AnimeGANv2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-981-15-5577-0_18)](https://doi.org/10.1007/978-981-15-5577-0_18)</li> <li>deep-significance [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/p19-1266)](https://doi.org/10.18653/v1/p19-1266)</li> <li>FGVC [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-030-58610-2_42)](https://doi.org/10.1007/978-3-030-58610-2_42)</li> <li>AlphaTensor [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-022-05172-4)](https://doi.org/10.1038/s41586-022-05172-4)</li> <li>AlphaFold [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1038/s41586-021-03819-2)](https://doi.org/10.1038/s41586-021-03819-2)</li> <li>Taming Transformers for High-Resolution Image Synthesis [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR46437.2021.01268)](https://doi.org/10.1109/CVPR46437.2021.01268)</li> <li>NeRViS [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV48922.2021.00230)](https://doi.org/10.1109/ICCV48922.2021.00230)</li> <li>EfficientDet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR42600.2020.01079)](https://doi.org/10.1109/CVPR42600.2020.01079)</li></ul> |
    • ![ - tracker)</li> <li>iterative/datachain [![](https://img.shields.io/github/stars/iterative/datachain?style=social)](https://github.com/iterative/datachain)</li> <li>callummcdougall/ARENA_3.0 [![](https://img.shields.io/github/stars/callummcdougall/ARENA_3.0?style=social)](https://github.com/callummcdougall/ARENA_3.0)</li> <li>ToTheBeginning/PuLID [![](https://img.shields.io/github/stars/ToTheBeginning/PuLID?style=social)](https://github.com/ToTheBeginning/PuLID)</li> <li>ZhengPeng7/BiRefNet [![](https://img.shields.io/github/stars/ZhengPeng7/BiRefNet?style=social)](https://github.com/ZhengPeng7/BiRefNet)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>unslothai/unsloth [![](https://img.shields.io/github/stars/unslothai/unsloth?style=social)](https://github.com/unslothai/unsloth)</li> <li>facebookresearch/segment-anything-2 [![](https://img.shields.io/github/stars/facebookresearch/segment-anything-2?style=social)](https://github.com/facebookresearch/segment-anything-2)</li> <li>lllyasviel/IC-Light [![](https://img.shields.io/github/stars/lllyasviel/IC-Light?style=social)](https://github.com/lllyasviel/IC-Light)</li> <li>gemelo-ai/vocos [![](https://img.shields.io/github/stars/gemelo-ai/vocos?style=social)](https://github.com/gemelo-ai/vocos)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>TransformerLensOrg/TransformerLens [![](https://img.shields.io/github/stars/TransformerLensOrg/TransformerLens?style=social)](https://github.com/TransformerLensOrg/TransformerLens)</li> <li>HongwenZhang/PyMAF-X [![](https://img.shields.io/github/stars/HongwenZhang/PyMAF-X?style=social)](https://github.com/HongwenZhang/PyMAF-X)</li> <li>roboflow/supervision [![](https://img.shields.io/github/stars/roboflow/supervision?style=social)](https://github.com/roboflow/supervision)</li> <li>KwaiVGI/LivePortrait [![](https://img.shields.io/github/stars/KwaiVGI/LivePortrait?style=social)](https://github.com/KwaiVGI/LivePortrait)</li> <li>piddnad/DDColor [![](https://img.shields.io/github/stars/piddnad/DDColor?style=social)](https://github.com/piddnad/DDColor)</li> <li>TencentARC/InstantMesh [![](https://img.shields.io/github/stars/TencentARC/InstantMesh?style=social)](https://github.com/TencentARC/InstantMesh)</li> <li>LAION-AI/aesthetic-predictor [![](https://img.shields.io/github/stars/LAION-AI/aesthetic-predictor?style=social)](https://github.com/LAION-AI/aesthetic-predictor)</li> <li>Doubiiu/DynamiCrafter [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter)</li> <li>facebookresearch/home-robot [![](https://img.shields.io/github/stars/facebookresearch/home-robot?style=social)](https://github.com/facebookresearch/home-robot)</li> <li>KillianLucas/open-interpreter [![](https://img.shields.io/github/stars/KillianLucas/open-interpreter?style=social)](https://github.com/KillianLucas/open-interpreter)</li> <li>jxnl/instructor [![](https://img.shields.io/github/stars/jxnl/instructor?style=social)](https://github.com/jxnl/instructor)</li></ul> | <ul><li>LIDA [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2023.acl-demo.11)](https://doi.org/10.18653/v1/2023.acl-demo.11)</li> <li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>Tune-A-Video [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.00701)](https://doi.org/10.1109/ICCV51070.2023.00701)</li> <li>FollowYourPose [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1609/aaai.v38i5.28206)](https://doi.org/10.1609/aaai.v38i5.28206)</li> <li>Text2Video-Zero [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.01462)](https://doi.org/10.1109/ICCV51070.2023.01462)</li> <li>GLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069)</li> <li>UniFormerV2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.00157)](https://doi.org/10.1109/ICCV51070.2023.00157)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>OWL-ViT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42)</li> <li>VideoReTalking [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>Dream Fields [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00094)](https://doi.org/10.1109/CVPR52688.2022.00094)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>GraphCast [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1126/science.adi2336)](https://doi.org/10.1126/science.adi2336)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>VRT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2024.3372454)](https://doi.org/10.1109/TIP.2024.3372454)</li> <li>Thin-Plate Spline Motion Model [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00364)](https://doi.org/10.1109/CVPR52688.2022.00364)</li> <li>PyMAF-X [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2023.3271691)](https://doi.org/10.1109/TPAMI.2023.3271691)</li> <li>FateZero [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.01460)](https://doi.org/10.1109/ICCV51070.2023.01460)</li> <li>py-irt [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.18653/v1/2021.acl-long.346)](https://doi.org/10.18653/v1/2021.acl-long.346)</li> <li>VQ-Diffusion [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01043)](https://doi.org/10.1109/CVPR52688.2022.01043)</li> <li>ECON [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00057)](https://doi.org/10.1109/CVPR52729.2023.00057)</li></ul> |
    • ![ - mdal/HiDT [![](https://img.shields.io/github/stars/saic-mdal/HiDT?style=social)](https://github.com/saic-mdal/HiDT)</li> <li>roboflow/multimodal-maestro [![](https://img.shields.io/github/stars/roboflow/multimodal-maestro?style=social)](https://github.com/roboflow/multimodal-maestro)</li> <li>ZhengPeng7/BiRefNet [![](https://img.shields.io/github/stars/ZhengPeng7/BiRefNet?style=social)](https://github.com/ZhengPeng7/BiRefNet)</li> <li>roboflow/supervision [![](https://img.shields.io/github/stars/roboflow/supervision?style=social)](https://github.com/roboflow/supervision)</li> <li>rohitgandikota/sliders [![](https://img.shields.io/github/stars/rohitgandikota/sliders?style=social)](https://github.com/rohitgandikota/sliders)</li> <li>tincans-ai/gazelle [![](https://img.shields.io/github/stars/tincans-ai/gazelle?style=social)](https://github.com/tincans-ai/gazelle)</li> <li>iterative/datachain [![](https://img.shields.io/github/stars/iterative/datachain?style=social)](https://github.com/iterative/datachain)</li> <li>alex04072000/NeRViS [![](https://img.shields.io/github/stars/alex04072000/NeRViS?style=social)](https://github.com/alex04072000/NeRViS)</li> <li>facebookresearch/segment-anything-2 [![](https://img.shields.io/github/stars/facebookresearch/segment-anything-2?style=social)](https://github.com/facebookresearch/segment-anything-2)</li> <li>unslothai/unsloth [![](https://img.shields.io/github/stars/unslothai/unsloth?style=social)](https://github.com/unslothai/unsloth)</li> <li>callummcdougall/ARENA_3.0 [![](https://img.shields.io/github/stars/callummcdougall/ARENA_3.0?style=social)](https://github.com/callummcdougall/ARENA_3.0)</li> <li>KwaiVGI/LivePortrait [![](https://img.shields.io/github/stars/KwaiVGI/LivePortrait?style=social)](https://github.com/KwaiVGI/LivePortrait)</li> <li>voxel51/fiftyone [![](https://img.shields.io/github/stars/voxel51/fiftyone?style=social)](https://github.com/voxel51/fiftyone)</li> <li>comfyanonymous/ComfyUI [![](https://img.shields.io/github/stars/comfyanonymous/ComfyUI?style=social)](https://github.com/comfyanonymous/ComfyUI)</li> <li>microsoft/torchgeo [![](https://img.shields.io/github/stars/microsoft/torchgeo?style=social)](https://github.com/microsoft/torchgeo)</li> <li>THU-MIG/yolov10 [![](https://img.shields.io/github/stars/THU-MIG/yolov10?style=social)](https://github.com/THU-MIG/yolov10)</li> <li>IDEA-Research/GroundingDINO [![](https://img.shields.io/github/stars/IDEA-Research/GroundingDINO?style=social)](https://github.com/IDEA-Research/GroundingDINO)</li> <li>TencentARC/InstantMesh [![](https://img.shields.io/github/stars/TencentARC/InstantMesh?style=social)](https://github.com/TencentARC/InstantMesh)</li> <li>Doubiiu/DynamiCrafter [![](https://img.shields.io/github/stars/Doubiiu/DynamiCrafter?style=social)](https://github.com/Doubiiu/DynamiCrafter)</li> <li>ultralytics/ultralytics [![](https://img.shields.io/github/stars/ultralytics/ultralytics?style=social)](https://github.com/ultralytics/ultralytics)</li> <li>EleutherAI/lm-evaluation-harness [![](https://img.shields.io/github/stars/EleutherAI/lm-evaluation-harness?style=social)](https://github.com/EleutherAI/lm-evaluation-harness)</li></ul> | <ul><li>FateZero [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.01460)](https://doi.org/10.1109/ICCV51070.2023.01460)</li> <li>Text2Video-Zero [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.01462)](https://doi.org/10.1109/ICCV51070.2023.01462)</li> <li>Tune-A-Video [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.00701)](https://doi.org/10.1109/ICCV51070.2023.00701)</li> <li>UniFormerV2 [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/ICCV51070.2023.00157)](https://doi.org/10.1109/ICCV51070.2023.00157)</li> <li>FollowYourPose [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1609/aaai.v38i5.28206)](https://doi.org/10.1609/aaai.v38i5.28206)</li> <li>ECON [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00057)](https://doi.org/10.1109/CVPR52729.2023.00057)</li> <li>VRT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TIP.2024.3372454)](https://doi.org/10.1109/TIP.2024.3372454)</li> <li>Gaussian Splatting [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3592433)](https://doi.org/10.1145/3592433)</li> <li>PyMAF-X [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/TPAMI.2023.3271691)](https://doi.org/10.1109/TPAMI.2023.3271691)</li> <li>OWL-ViT [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20080-9_42)](https://doi.org/10.1007/978-3-031-20080-9_42)</li> <li>DragGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3588432.3591500)](https://doi.org/10.1145/3588432.3591500)</li> <li>SadTalker [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52729.2023.00836)](https://doi.org/10.1109/CVPR52729.2023.00836)</li> <li>Thin-Plate Spline Motion Model [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00364)](https://doi.org/10.1109/CVPR52688.2022.00364)</li> <li>VideoReTalking [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550469.3555399)](https://doi.org/10.1145/3550469.3555399)</li> <li>Detic [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20077-9_21)](https://doi.org/10.1007/978-3-031-20077-9_21)</li> <li>Dream Fields [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.00094)](https://doi.org/10.1109/CVPR52688.2022.00094)</li> <li>JoJoGAN [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-19787-1_8)](https://doi.org/10.1007/978-3-031-19787-1_8)</li> <li>LDM [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01042)](https://doi.org/10.1109/CVPR52688.2022.01042)</li> <li>GLIP [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01069)](https://doi.org/10.1109/CVPR52688.2022.01069)</li> <li>NAFNet [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1007/978-3-031-20071-7_2)](https://doi.org/10.1007/978-3-031-20071-7_2)</li> <li>IDE-3D [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1145/3550454.3555506)](https://doi.org/10.1145/3550454.3555506)</li> <li>StyleSDF [![](https://api.juleskreuer.eu/citation-badge.php?doi=10.1109/CVPR52688.2022.01314)](https://doi.org/10.1109/CVPR52688.2022.01314)</li></ul> |