Megatron-Core
wikipedia:Megatron-Core is a self-contained, lightweight PyTorch library that packages everything essential for training large-scale transformer models.
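A minimal, untested sketch of what building a model with the library can look like, using its quickstart-style building blocks (parallel_state, TransformerConfig, GPTModel). The exact module paths, constructor arguments, and single-process distributed setup shown here are assumptions and may differ between Megatron-Core versions:

 # Sketch: construct a tiny GPT model with Megatron-Core building blocks.
 # Assumes a recent megatron-core release; APIs may vary between versions.
 import os
 
 import torch
 from megatron.core import parallel_state
 from megatron.core.models.gpt.gpt_model import GPTModel
 from megatron.core.models.gpt.gpt_layer_specs import get_gpt_layer_local_spec
 from megatron.core.transformer.transformer_config import TransformerConfig
 
 # Megatron-Core expects torch.distributed and its model-parallel groups to be
 # initialized, even for a single process (world_size=1, no real parallelism).
 os.environ.setdefault("MASTER_ADDR", "localhost")
 os.environ.setdefault("MASTER_PORT", "29500")
 torch.distributed.init_process_group(backend="gloo", rank=0, world_size=1)
 parallel_state.initialize_model_parallel(
     tensor_model_parallel_size=1,
     pipeline_model_parallel_size=1,
 )
 
 # A deliberately tiny transformer configuration, initialized on the CPU.
 config = TransformerConfig(
     num_layers=2,
     hidden_size=64,
     num_attention_heads=4,
     use_cpu_initialization=True,
     pipeline_dtype=torch.float32,
 )
 
 model = GPTModel(
     config=config,
     transformer_layer_spec=get_gpt_layer_local_spec(),
     vocab_size=1024,
     max_sequence_length=128,
 )
 print(sum(p.numel() for p in model.parameters()), "parameters")

In real training runs the same building blocks are combined with the library's tensor-, pipeline-, and data-parallel utilities across many GPUs; the single-process setup above is only meant to illustrate the API surface.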