Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/microsoft/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
- Host: GitHub
- URL: https://github.com/microsoft/unilm
- Owner: microsoft
- License: mit
- Created: 2019-07-23T04:15:28.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2024-03-20T17:15:43.000Z (3 months ago)
- Last Synced: 2024-03-25T18:30:46.198Z (3 months ago)
- Topics: beit, beit-3, deepnet, document-ai, foundation-models, kosmos, kosmos-1, layoutlm, layoutxlm, llm, minilm, mllm, multimodal, nlp, pre-trained-model, textdiffuser, trocr, unilm, xlm-e
- Language: Python
- Homepage: https://aka.ms/GeneralAI
- Size: 57.5 MB
- Stars: 18,032
- Watchers: 303
- Forks: 2,333
- Open Issues: 549
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Security: SECURITY.md
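The metadata above is also exposed by the service as a JSON representation. As a minimal sketch of consuming it (the field names below are assumptions mirroring the bullet list, not the service's confirmed schema), parsing such a payload might look like:

```python
import json

# Hypothetical payload mirroring the metadata shown above; the exact
# field names used by the ecosyste.ms API are an assumption here.
payload = json.loads("""
{
  "full_name": "microsoft/unilm",
  "language": "Python",
  "stargazers_count": 18032,
  "forks_count": 2333,
  "topics": ["beit", "layoutlm", "trocr", "unilm"]
}
""")

# Pull out a few fields the way an API consumer might.
print(payload["full_name"])
print(payload["stargazers_count"], "stars,", payload["forks_count"], "forks")
print("topics:", ", ".join(payload["topics"]))
```

In practice you would fetch this document over HTTP from the service rather than embedding it as a string.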
Lists
- Awesome-Multimodal-Research - *08/2022 - [BEiT-3](https://arxiv.org/abs/2208.10442) is a general-purpose multimodal foundation model, which achieves state-of-the-art transfer performance on both vision and vision-language tasks. https://github.com/microsoft/unilm/tree/master/beit* (News)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- Awesome-MIM - unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities. (Related Project / Project of Self-supervised Learning)
- awesome-vision-language-pretraining - unilm
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-ChatGPT-repositories - unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (NLP)
- awesome-self-supervised-vision - unilm
- awesome-machine-learning-resources - **[Library
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- my-awesome - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- my-awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-Multi-Document-Summarization - microsoft/unilm
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-lists - unilm
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities | microsoft | 16927 | (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- my-awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities | microsoft | 18883 | (Python)
- my-awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- awesome-stars - unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities | microsoft | 18970 | (Python)
- AiTreasureBox - microsoft/unilm - ![2024-06-12_18911_2](https://img.shields.io/github/stars/microsoft/unilm.svg) Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Repos)
- awesome-stars - unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities | microsoft | 18962 | (Python)
- awesome-stars - microsoft/unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)
- StarryDivineSky - microsoft/unilm - Unified language model pre-training for NLP and beyond (pre-trained models)
- awesome-stars - microsoft/unilm - `★18952` Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (Python)