Attention to Rows and Columns: Altering Transformers' Self-Attention Mechanism for Greater Efficiency
A new approach from researchers at the Chinese National Engineering Laboratory alters transformers' self-attention mechanism, attending along image rows and columns separately, to balance computational efficiency with performance on vision tasks.
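To make the idea in the headline concrete, below is a minimal sketch of row-and-column self-attention in the axial-attention style: each row and each column of an image feature map is treated as its own attention sequence, which reduces the cost of full attention over an H x W map from O((HW)^2) to O(HW(H+W)). This is a generic illustration of the pattern the title names, not the paper's specific design; the module name, tensor layout, and use of PyTorch's nn.MultiheadAttention are assumptions made for the sketch.

```python
# A hedged sketch of row-and-column self-attention, illustrating the
# general efficiency idea; module name and shapes are assumptions,
# not the paper's actual architecture.
import torch
import torch.nn as nn


class RowColumnSelfAttention(nn.Module):
    """Multi-head self-attention applied along rows, then along columns.

    Full attention over an H x W feature map costs O((H*W)^2); attending
    within each row and each column separately costs O(H*W*(H+W)).
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, height, width, channels)
        b, h, w, c = x.shape

        # Row attention: treat each of the b*h rows as a sequence of length W.
        rows = x.reshape(b * h, w, c)
        rows, _ = self.row_attn(rows, rows, rows)
        x = x + rows.reshape(b, h, w, c)  # residual connection

        # Column attention: treat each of the b*w columns as a sequence of length H.
        cols = x.permute(0, 2, 1, 3).reshape(b * w, h, c)
        cols, _ = self.col_attn(cols, cols, cols)
        x = x + cols.reshape(b, w, h, c).permute(0, 2, 1, 3)

        return x


if __name__ == "__main__":
    attn = RowColumnSelfAttention(dim=64, num_heads=4)
    feats = torch.randn(2, 16, 16, 64)  # (batch, H, W, C) feature map
    print(attn(feats).shape)  # torch.Size([2, 16, 16, 64])
```

The residual connections keep the module drop-in compatible with a standard transformer block, and because every position still reaches every other position within two steps (via its row, then the target's column), the factored scheme trades only a modest amount of modeling power for the large reduction in compute.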