Trillions of Parameters: Are AI models with trillions of parameters the new normal?
The trend toward ever-larger models crossed the threshold from immense to ginormous. Google kicked off 2021 with Switch Transformer, the first published work to exceed a trillion parameters, weighing in at 1.6 trillion.
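Switch Transformer reaches that parameter count through sparse mixture-of-experts layers: a router sends each token to a single expert feed-forward network, so total parameters grow with the number of experts while the compute spent per token stays roughly flat. Below is a minimal sketch of that top-1 ("switch") routing idea, not the authors' implementation; the class name, dimensions, and expert structure are illustrative assumptions.

```python
# Minimal sketch of Switch-style top-1 expert routing (illustrative only).
# Adding experts multiplies the layer's parameters, but each token still
# flows through exactly one expert's weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchFFN(nn.Module):  # hypothetical name, not from the paper's code
    def __init__(self, d_model: int, d_ff: int, num_experts: int):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)  # routing logits per token
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model) -- batch and sequence dims flattened for clarity
        probs = F.softmax(self.router(x), dim=-1)   # (tokens, num_experts)
        gate, expert_idx = probs.max(dim=-1)        # top-1 "switch" routing
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i                  # tokens assigned to expert i
            if mask.any():
                # scale by the gate probability so routing stays differentiable
                out[mask] = gate[mask].unsqueeze(-1) * expert(x[mask])
        return out

# Toy usage: 8 experts hold ~8x the feed-forward parameters of a dense layer,
# yet each token is processed by only one of them.
layer = SwitchFFN(d_model=64, d_ff=256, num_experts=8)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```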