DeepSpeed
Toward 1 Trillion Parameters: Microsoft upgrades its DeepSpeed optimization library.
An open-source library could spawn trillion-parameter neural networks and help small-time developers build big-league models: Microsoft has upgraded DeepSpeed, a library that accelerates the PyTorch deep learning framework.
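For context, a minimal sketch of how DeepSpeed typically wraps an ordinary PyTorch model is shown below. The model, batch size, learning rate, and ZeRO stage here are illustrative assumptions, not details from the announcement; the general pattern is that `deepspeed.initialize` takes a standard PyTorch module plus a config dict and returns an engine that handles distributed training and memory optimizations.

```python
# Illustrative sketch only -- config values and model are assumptions.
import torch
import deepspeed

# An ordinary PyTorch model; nothing DeepSpeed-specific in its definition.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 1024),
)

# DeepSpeed is configured with a JSON-style dict (or file). ZeRO stage 2
# partitions optimizer states and gradients across workers to save memory.
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# The returned engine replaces the usual training objects; training loops
# then call engine.backward(loss) and engine.step() instead of the plain
# PyTorch equivalents.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```

Scripts like this are usually run through the `deepspeed` launcher (e.g. `deepspeed train.py`), which sets up the distributed environment across the available GPUs.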