attention models
Is Attention Better Than Matrix Decomposition?
As an essential ingredient of modern deep learning, the attention mechanism, especially self-attention, plays a vital role in discovering global correlations. However, is hand-crafted attention irreplaceable when modeling the global context? Our …