Pale Transformer
Attention to Rows and Columns: Altering Transformers' Self-Attention Mechanism for Greater Efficiency
A new approach restricts transformers' self-attention to rows and columns of the input, improving computational efficiency while maintaining performance on vision tasks.
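To illustrate the general idea, the sketch below shows self-attention split between rows and columns of a feature map: half the heads attend within each row, the other half within each column, so cost per token grows with height + width rather than height × width. This is a minimal illustration of row-and-column attention, not the paper's exact Pale-Shaped Attention; the module and parameter names (`RowColumnAttention`, `dim`, `num_heads`) are assumptions for this example.

```python
import torch
import torch.nn as nn


class RowColumnAttention(nn.Module):
    """Self-attention restricted to rows and columns of a feature map.

    Half of the heads attend within each row, the other half within each
    column, so the per-token cost scales with H + W rather than H * W.
    (Illustrative sketch, not the Pale Transformer's exact mechanism.)
    """

    def __init__(self, dim, num_heads=8):
        super().__init__()
        assert dim % 2 == 0 and num_heads % 2 == 0
        self.row_attn = nn.MultiheadAttention(dim // 2, num_heads // 2, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim // 2, num_heads // 2, batch_first=True)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (batch, height, width, channels)
        b, h, w, c = x.shape
        x_row, x_col = x.split(c // 2, dim=-1)

        # Row attention: each of the b*h rows is an independent sequence of w tokens.
        rows = x_row.reshape(b * h, w, c // 2)
        rows, _ = self.row_attn(rows, rows, rows)
        rows = rows.reshape(b, h, w, c // 2)

        # Column attention: each of the b*w columns is a sequence of h tokens.
        cols = x_col.permute(0, 2, 1, 3).reshape(b * w, h, c // 2)
        cols, _ = self.col_attn(cols, cols, cols)
        cols = cols.reshape(b, w, h, c // 2).permute(0, 2, 1, 3)

        # Merge the row- and column-attended halves back into one representation.
        return self.proj(torch.cat([rows, cols], dim=-1))


if __name__ == "__main__":
    attn = RowColumnAttention(dim=64, num_heads=8)
    feats = torch.randn(2, 14, 14, 64)   # batch of 14x14 feature maps
    print(attn(feats).shape)             # torch.Size([2, 14, 14, 64])
```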