Popular repositories
- Megatron-LM (Public, forked from lhb8125/Megatron-LM; Python): Ongoing research training transformer models at scale.
- TransformerEngine (Public, forked from NVIDIA/TransformerEngine; Python): A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper, Ada and Blackwell GPUs, to provide better performance with lower memory…
- Megatron-MoE-ModelZoo (Public, forked from yanring/Megatron-MoE-ModelZoo; Python): Best practices for training DeepSeek, Mixtral, Qwen and other MoE models using Megatron Core.
- Megatron-Bridge (Public, forked from NVIDIA-NeMo/Megatron-Bridge; Python): Training library for Megatron-based models.
- DeepEP (Public, forked from deepseek-ai/DeepEP; CUDA): An efficient expert-parallel communication library.

