Perceptrons Are All You Need: Google Brain's Multi-Layer Perceptron Rivals Transformers
The paper that introduced the transformer famously declared, “Attention is all you need.” On the contrary, new work shows that you may not need transformer-style attention at all.

What’s new: Hanxiao Liu and colleagues at Google