Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
[Preprint] Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models
https://github.com/LINs-lab/DynMoE
Last synced: 3 months ago
JSON representation
- Host: GitHub
- URL: https://github.com/LINs-lab/DynMoE
- Owner: LINs-lab
- License: apache-2.0
- Created: 2024-05-17T08:25:31.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-06-24T08:52:58.000Z (4 months ago)
- Last Synced: 2024-06-28T17:30:07.770Z (4 months ago)
- Topics: adaptive-computation, language-model, mixture-of-experts, moe, multimodal-large-language-models, vision-transformer
- Language: Python
- Homepage: https://arxiv.org/abs/2405.14297
- Size: 57.3 MB
- Stars: 27
- Watchers: 8
- Forks: 6
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
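The metadata above is also exposed as a JSON representation, which makes it easy to filter or aggregate indexed projects programmatically. The sketch below parses a record shaped like this page's fields; the exact field names (`full_name`, `topics`, `stargazers_count`, etc.) are assumptions for illustration, not the service's documented schema.

```python
import json

# Sample record mirroring the metadata listed above.
# Field names are assumed, not taken from the real API schema.
record = json.loads("""
{
  "full_name": "LINs-lab/DynMoE",
  "host": "GitHub",
  "license": "apache-2.0",
  "language": "Python",
  "homepage": "https://arxiv.org/abs/2405.14297",
  "topics": ["adaptive-computation", "language-model", "mixture-of-experts",
             "moe", "multimodal-large-language-models", "vision-transformer"],
  "stargazers_count": 27,
  "forks_count": 6
}
""")

# Example use: select projects tagged with a given topic.
if "mixture-of-experts" in record["topics"]:
    print(f'{record["full_name"]}: {record["stargazers_count"]} stars')
```

A real client would fetch such records over HTTP and loop over the results; the parsing and filtering step stays the same.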
Awesome Lists containing this project
- awesome-mixture-of-experts - 23 May 2024