In a blog post published earlier this month, Microsoft announced DeepSpeed, an open-source, PyTorch-compatible deep learning optimization library that makes distributed training easy, efficient, and effective. DeepSpeed advances large-model training by improving scale, speed, cost, and usability, unlocking the ability to train models with more than 100 billion parameters.