Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
[Preprint] Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models
https://github.com/LINs-lab/DynMoE
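The indexed project is a dynamic mixture-of-experts (MoE) approach for Transformer models. For context, a MoE layer routes each token through a small subset of expert networks chosen by a gate. The sketch below is a generic top-k gate in PyTorch, purely illustrative and not DynMoE's implementation; per its title, DynMoE's contribution is auto-tuning the number of activated experts rather than fixing k.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Generic top-k mixture-of-experts layer (illustrative only; DynMoE
    additionally auto-tunes how many experts each token activates)."""

    def __init__(self, dim: int, num_experts: int = 4, k: int = 2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Score every expert, keep the top-k per token.
        scores = self.gate(x)                          # (tokens, num_experts)
        weights, idx = scores.topk(self.k, dim=-1)     # both (tokens, k)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e               # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

tokens = torch.randn(8, 64)
print(TopKMoE(64)(tokens).shape)  # torch.Size([8, 64])
```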
JSON representation (a fetch sketch follows the metadata below)
- Host: GitHub
- URL: https://github.com/LINs-lab/DynMoE
- Owner: LINs-lab
- License: apache-2.0
- Created: 2024-05-17T08:25:31.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2024-08-21T07:18:45.000Z (4 months ago)
- Last Synced: 2024-08-21T08:38:22.709Z (4 months ago)
- Topics: adaptive-computation, dynamic-neural-network, language-model, mixture-of-experts, moe, multimodal-large-language-models
- Language: Python
- Homepage: https://arxiv.org/abs/2405.14297
- Size: 57.3 MB
- Stars: 36
- Watchers: 11
- Forks: 8
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
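The block above is the human-readable rendering of the project's JSON record. A minimal sketch of retrieving it programmatically, assuming a hypothetical lookup endpoint and field names (the actual API shape at https://awesome.ecosyste.ms may differ):

```python
import requests

# Hypothetical lookup endpoint and query parameter; the real
# awesome.ecosyste.ms API may expose a different route.
API_URL = "https://awesome.ecosyste.ms/api/v1/projects/lookup"
REPO_URL = "https://github.com/LINs-lab/DynMoE"

resp = requests.get(API_URL, params={"url": REPO_URL}, timeout=30)
resp.raise_for_status()
project = resp.json()

# Field names here are assumptions mirroring the metadata shown above.
for field in ("name", "language", "stargazers_count", "topics", "last_synced_at"):
    print(f"{field}: {project.get(field)}")
```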
Awesome Lists containing this project
- awesome-mixture-of-experts (23 May 2024)