[ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts
- Host: GitHub
- URL: https://github.com/skyworkai/moe-plus-plus
- Owner: SkyworkAI
- License: apache-2.0
- Created: 2024-10-08T07:49:40.000Z (12 months ago)
- Default Branch: main
- Last Pushed: 2024-10-16T06:21:31.000Z (12 months ago)
- Last Synced: 2025-06-17T15:51:12.717Z (4 months ago)
- Topics: large-language-models, llms, mixture-of-experts, moe
- Language: Python
- Homepage: https://arxiv.org/abs/2410.07348
- Size: 1.94 MB
- Stars: 226
- Watchers: 2
- Forks: 9
- Open Issues: 1
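
For context, the linked paper (https://arxiv.org/abs/2410.07348) describes MoE++ as a Mixture-of-Experts design that mixes standard FFN experts with "zero-computation" experts, so that some tokens can be processed without running a full feed-forward block. Below is a minimal sketch of that idea, assuming a simplified top-1 router and illustrative expert types (zero, copy/identity, constant, FFN); the class names, shapes, and routing details are assumptions for illustration and are not taken from the repository's code.

```python
# Illustrative sketch only: a router over FFN experts plus three
# zero-computation experts (zero, copy, constant). Not the repo's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FFNExpert(nn.Module):
    """A standard feed-forward expert (costs compute)."""
    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model)
        )

    def forward(self, x):
        return self.net(x)


class SketchMoEPlusPlusLayer(nn.Module):
    """Routes each token to one FFN expert or to a zero-computation expert:
    zero (drop the token), copy (identity), or a learned constant vector."""
    def __init__(self, d_model: int, d_hidden: int, num_ffn_experts: int = 4):
        super().__init__()
        self.ffn_experts = nn.ModuleList(
            FFNExpert(d_model, d_hidden) for _ in range(num_ffn_experts)
        )
        self.constant = nn.Parameter(torch.zeros(d_model))
        # One routing logit per FFN expert, plus zero / copy / constant.
        self.router = nn.Linear(d_model, num_ffn_experts + 3)
        self.num_ffn_experts = num_ffn_experts

    def forward(self, x):                        # x: (num_tokens, d_model)
        probs = F.softmax(self.router(x), dim=-1)
        top_p, top_idx = probs.max(dim=-1)       # top-1 routing for simplicity
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.ffn_experts):
            mask = top_idx == e
            if mask.any():
                out[mask] = expert(x[mask])      # only routed tokens hit the FFN
        n = self.num_ffn_experts
        # Index n is the zero expert: output stays zero, no compute spent.
        out[top_idx == n + 1] = x[top_idx == n + 1]   # copy (identity) expert
        out[top_idx == n + 2] = self.constant         # constant expert
        return out * top_p.unsqueeze(-1)         # scale by the routing weight
```

In this sketch, a token routed to the zero, copy, or constant expert skips the FFN entirely, which is where the acceleration promised in the title would come from; the actual repository adds routing and load-balancing details omitted here.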