Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
awesome-mixture-of-experts
Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
https://github.com/superbrucejia/awesome-mixture-of-experts
Last synced: 4 days ago
- Survey
- Sparse Gating Mechanism
  - Auxiliary Load Balance Loss
  - Mixtures of Experts Architecture
  - Parameter-efficient Fine-tuning
  - Mutual Information Loss
  - Expert Capacity Limit
  - Non-trainable Gating Mechanism
  - Expert-choice Gating
  - From Dense to Sparse
- Foundational Work
- Dense Gating Mechanism
  - From Dense to Sparse
- Soft Gating Mechanism
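The sparse-gating topics above (top-k routing plus an auxiliary load-balance loss) refer to standard MoE router machinery. A minimal NumPy sketch of that mechanism, written for illustration and not taken from any entry in the list — function names and the Switch-style loss form are assumptions:

```python
import numpy as np

def top_k_gating(logits, k=2):
    """Sparse gating: softmax over experts, keep only the k largest
    weights per token, then renormalize so the kept weights sum to 1."""
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = exp / exp.sum(axis=-1, keepdims=True)
    topk_idx = np.argsort(probs, axis=-1)[:, -k:]       # k largest per token
    mask = np.zeros_like(probs)
    np.put_along_axis(mask, topk_idx, 1.0, axis=-1)
    gates = probs * mask
    gates /= gates.sum(axis=-1, keepdims=True)
    return gates, probs

def load_balance_loss(probs):
    """Switch-style auxiliary loss: num_experts * sum_i f_i * P_i, where
    f_i is the fraction of tokens whose top-1 choice is expert i and
    P_i is the mean router probability assigned to expert i."""
    num_tokens, num_experts = probs.shape
    top1 = probs.argmax(axis=-1)
    f = np.bincount(top1, minlength=num_experts) / num_tokens
    P = probs.mean(axis=0)
    return num_experts * float(np.sum(f * P))

rng = np.random.default_rng(0)
logits = rng.normal(size=(16, 4))       # 16 tokens routed over 4 experts
gates, probs = top_k_gating(logits, k=2)
aux = load_balance_loss(probs)
```

The loss is minimized (value 1.0) when routing is perfectly uniform across experts, which is why it is added to the task loss as a balancing pressure.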
Sub Categories

| Sub Category | Count |
| --- | --- |
| Mixtures of Experts Architecture | 23 |
| Parameter-efficient Fine-tuning | 18 |
| Auxiliary Load Balance Loss | 12 |
| From Dense to Sparse | 9 |
| Non-trainable Gating Mechanism | 8 |
| Mutual Information Loss | 6 |
| Expert Merging | 4 |
| Expert Capacity Limit | 3 |
| Expert-choice Gating | 2 |
| Hierarchical Mixtures of Experts for the EM Algorithm | 2 |
| Token Merging | 2 |
| Sparse-Gated Mixture of Experts in Transformer | 1 |
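Two of the smaller sub-categories, Expert-choice Gating and Expert Capacity Limit, describe a routing variant in which experts select tokens rather than the reverse, so capacity is enforced by construction. A hypothetical NumPy sketch of that idea (the function name and shapes are illustrative assumptions, not from the list):

```python
import numpy as np

def expert_choice_routing(logits, capacity):
    """Expert-choice gating: each expert picks its top-`capacity` tokens
    by router score, so no expert can exceed its capacity limit."""
    num_tokens, num_experts = logits.shape
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = exp / exp.sum(axis=-1, keepdims=True)
    assignment = np.zeros_like(probs)
    for e in range(num_experts):
        chosen = np.argsort(probs[:, e])[-capacity:]   # expert e's top tokens
        assignment[chosen, e] = probs[chosen, e]
    return assignment

rng = np.random.default_rng(1)
assignment = expert_choice_routing(rng.normal(size=(12, 3)), capacity=4)
```

Note the trade-off this makes: per-expert load is exactly `capacity` tokens, but a given token may be picked by several experts or by none.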