Turing-NLG

5 Posts

Yoav Shoham: Language models that reason

I believe that natural language processing in 2022 will re-embrace symbolic reasoning, harmonizing it with the statistical operation of modern neural networks. Let me explain what I mean by this.

Trillions of Parameters: Are AI models with trillions of parameters the new normal?

The trend toward ever-larger models crossed the threshold from immense to ginormous. Google kicked off 2021 with Switch Transformer, the first published work to exceed a trillion parameters, weighing in at 1.6 trillion.

Toward 1 Trillion Parameters: Microsoft upgrades its DeepSpeed optimization library

An open-source library could spawn trillion-parameter neural networks and help small-time developers build big-league models. Microsoft upgraded DeepSpeed, a library that accelerates the PyTorch deep learning framework.

Bigger is Better: A research summary of Microsoft's Turing-NLG language model

Natural language processing has lately come to resemble an arms race, as the big AI companies build models with ever-larger numbers of parameters. Microsoft recently held the record, but not for long.

Big Bot Makes Small Talk: A research summary of Facebook's Generative BST chatbot

Facebook recently rolled out its entry in the World’s Biggest Chatbot sweepstakes. In keeping with the company’s social-networking dominance, the bot is designed to excel at chitchat on any subject.
