https://github.com/the-swarm-corporation/clustermoe
A neural network architecture that extends Mixture of Experts (MoE) with hierarchical expert clustering, dynamic tree-based routing, and per-expert reliability tracking, aimed at improving scalability, specialization, and robustness.
- Host: GitHub
- URL: https://github.com/the-swarm-corporation/clustermoe
- Owner: The-Swarm-Corporation
- License: MIT
- Created: 2025-06-28T17:41:34.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-06-28T17:54:43.000Z (3 months ago)
- Last Synced: 2025-06-28T18:10:18.674Z (3 months ago)
- Topics: ai, attention, llms, moe, pytorch, pytorch-models, transformers
- Language: Python
- Homepage: https://swarms.ai
- Size: 0 Bytes
- Stars: 1
- Watchers: 0
- Forks: 0
- Open Issues: 4