https://github.com/vinsmokesomya/mixture-of-idiotic-experts
🧠✍️🎭 Mixture of Idiotic Experts: A PyTorch-based Sparse Mixture of Experts (MoE) model for generating Shakespeare-like text, character by character. Inspired by Andrej Karpathy's makemore.
- Host: GitHub
- URL: https://github.com/vinsmokesomya/mixture-of-idiotic-experts
- Owner: VinsmokeSomya
- Created: 2025-05-25T13:10:59.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2025-05-25T13:40:39.000Z (5 months ago)
- Last Synced: 2025-06-06T01:07:33.470Z (4 months ago)
- Topics: character-level-lm, deep-learning, generative-ai, mixture-of-experts, moe, nlp, python, pytorch
- Language: Python
- Homepage:
- Size: 3.01 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Metadata Files:
- Readme: README.md
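
The repository's source is not reproduced on this page, but its description points at a standard sparse Mixture-of-Experts design: each token's hidden state is routed by a learned gate to its top-k expert feed-forward networks, and the selected experts' outputs are combined with the gate's softmax weights. Below is a minimal PyTorch sketch of such a layer, assuming a makemore-style character-level transformer; the class names, expert width, and defaults are illustrative assumptions and are not taken from the repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Expert(nn.Module):
    """A small feed-forward expert (illustrative width: 4x the embedding size)."""
    def __init__(self, n_embd):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.ReLU(),
            nn.Linear(4 * n_embd, n_embd),
        )

    def forward(self, x):
        return self.net(x)


class SparseMoE(nn.Module):
    """Route each token to its top-k experts via a learned gate (hypothetical layer)."""
    def __init__(self, n_embd, num_experts=4, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(Expert(n_embd) for _ in range(num_experts))
        self.gate = nn.Linear(n_embd, num_experts)
        self.top_k = top_k

    def forward(self, x):                          # x: (batch, seq, n_embd)
        logits = self.gate(x)                      # gating scores: (batch, seq, num_experts)
        top_vals, top_idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(top_vals, dim=-1)      # renormalise over the chosen experts only
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = (top_idx == i)                  # where expert i was selected: (batch, seq, top_k)
            if mask.any():
                # per-token weight for expert i (0 where it was not selected)
                w = (weights * mask).sum(dim=-1, keepdim=True)
                out = out + w * expert(x)
        return out
```

In a full character-level model of this kind, such a layer would typically replace the feed-forward sublayer of each transformer block, so that only the top-k experts contribute to each token's output; this sketch runs every expert densely for clarity rather than dispatching only the routed tokens.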