Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/NVIDIA/Megatron-LM
Ongoing research training transformer models at scale
Topics: large-language-models, model-para, transformers
Last synced: 3 months ago
- Host: GitHub
- URL: https://github.com/NVIDIA/Megatron-LM
- Owner: NVIDIA
- License: other
- Created: 2019-03-21T16:15:52.000Z (almost 6 years ago)
- Default Branch: main
- Last Pushed: 2024-05-20T08:21:00.000Z (8 months ago)
- Last Synced: 2024-05-22T01:11:52.406Z (8 months ago)
- Topics: large-language-models, model-para, transformers
- Language: Python
- Homepage: https://docs.nvidia.com/megatron-core/developer-guide/latest/user-guide/index.html#quick-start
- Size: 8.13 MB
- Stars: 8,839
- Watchers: 155
- Forks: 1,985
- Open Issues: 425
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Codeowners: CODEOWNERS
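The homepage listed above points to the Megatron Core quick-start guide. As a rough illustration of what that quick start covers, below is a minimal sketch of building a toy GPT model with Megatron Core. It assumes the `megatron-core` package and the quick-start API surface (`parallel_state`, `TransformerConfig`, `GPTModel`, `get_gpt_layer_local_spec`); exact module paths and constructor arguments may differ between releases, so treat it as a sketch rather than a definitive recipe.

```python
# Minimal sketch based on the Megatron Core quick-start linked above.
# Assumptions: `pip install megatron-core`, a CUDA-enabled PyTorch build,
# and launching via `torchrun` so the distributed env vars are set.
# Module paths and argument names may differ between megatron-core releases.
import os

import torch
from megatron.core import parallel_state
from megatron.core.models.gpt.gpt_layer_specs import get_gpt_layer_local_spec
from megatron.core.models.gpt.gpt_model import GPTModel
from megatron.core.transformer.transformer_config import TransformerConfig


def initialize_model_parallel(tp_size: int = 1, pp_size: int = 1) -> None:
    """Set up torch.distributed and Megatron's tensor/pipeline parallel groups."""
    rank = int(os.environ.get("LOCAL_RANK", "0"))
    torch.cuda.set_device(rank)
    torch.distributed.init_process_group(backend="nccl")
    parallel_state.initialize_model_parallel(
        tensor_model_parallel_size=tp_size,
        pipeline_model_parallel_size=pp_size,
    )


def build_tiny_gpt() -> GPTModel:
    """Build a deliberately tiny GPT model; real runs scale these numbers up."""
    config = TransformerConfig(
        num_layers=2,
        hidden_size=128,
        num_attention_heads=4,
        use_cpu_initialization=True,
        pipeline_dtype=torch.float32,
    )
    return GPTModel(
        config=config,
        transformer_layer_spec=get_gpt_layer_local_spec(),
        vocab_size=32000,
        max_sequence_length=1024,
    )


if __name__ == "__main__":
    # e.g. `torchrun --nproc-per-node=1 quickstart_sketch.py`
    initialize_model_parallel(tp_size=1, pp_size=1)
    model = build_tiny_gpt().cuda()
    print(model)
```

Raising `tensor_model_parallel_size` and `pipeline_model_parallel_size` (with a matching number of GPUs) is where the repository's "model-para" topic, i.e. tensor and pipeline model parallelism, comes into play.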
Awesome Lists containing this project
- awesome-local-ai - Megatron-LM - Ongoing research training transformer models at scale. (Training)
- awesome-bert - NVIDIA/Megatron-LM
- Awesome_Multimodel_LLM - Megatron-LM - Ongoing research training transformer models at scale. (LLM Training Frameworks)
- awesome-distributed-ml - Megatron-LM: Ongoing Research Training Transformer Models at Scale
- awesome-llm - Megatron-LM - A research framework for training Transformer models at scale. (LLM Training Frameworks / LLM Evaluation Tools)
- awesome-lm-system - Megatron-LM
- awesome-transformer-nlp - NVIDIA/Megatron-LM - Ongoing research training transformer language models at scale, including: BERT. (Transformer Implementations By Communities / PyTorch)
- awesome-repositories - NVIDIA/Megatron-LM - Ongoing research training transformer models at scale (Python)
- StarryDivineSky - NVIDIA/Megatron-LM
- Awesome-LLM - Megatron-LM - Ongoing research training transformer models at scale. (LLM Training Frameworks)
- Awesome-LLM-Compression - Megatron-LM
- project-awesome - NVIDIA/Megatron-LM - Ongoing research training transformer models at scale (Python)
- awesome-open-source-lms - Megatron-LM
- awesome-production-machine-learning - Megatron-LM - Megatron-LM is a highly optimized and efficient library for training large language models. (Industry Strength NLP)
- AiTreasureBox - NVIDIA/Megatron-LM - Ongoing research training transformer models at scale (Repos)
- awesome-ai-papers - Megatron-LM \[[Megatron-DeepSpeed](https://github.com/microsoft/Megatron-DeepSpeed)\]\[[Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed)\]\[[Pai-Megatron-Patch](https://github.com/alibaba/Pai-Megatron-Patch)\] (NLP / 3. Pretraining)