# billion-parameters

One public repository matches this topic.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Topics: machine-learning, compression, deep-learning, gpu, inference, pytorch, zero, data-parallelism, model-parallelism, mixture-of-experts, pipeline-parallelism, billion-parameters, trillion-parameters
Updated Oct 13, 2025 · Python
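Training billion-parameter models with DeepSpeed is typically driven by a JSON configuration file that enables features such as ZeRO data parallelism and mixed precision. The following is a minimal sketch of such a config; the specific values (batch size, ZeRO stage) are illustrative assumptions, not recommendations for any particular model.

```json
{
  "train_batch_size": 32,
  "gradient_accumulation_steps": 1,
  "fp16": {
    "enabled": true
  },
  "zero_optimization": {
    "stage": 2
  }
}
```

A config like this is usually passed to the `deepspeed` launcher or to `deepspeed.initialize()`; ZeRO stage 2 partitions optimizer states and gradients across data-parallel workers to reduce per-GPU memory.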