Is Attention Better Than Matrix Decomposition?
As an essential ingredient of modern deep learning, the attention mechanism, particularly self-attention, plays a vital role in discovering global correlations. However, is hand-crafted attention irreplaceable when modeling the global context? Our …
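The alternative hinted at here, modeling global context with a matrix decomposition rather than attention, can be illustrated with a minimal sketch. The snippet below (names and shapes are illustrative, not from the paper) uses a truncated SVD to build the best rank-r approximation of a feature matrix, which recovers the shared global structure across all positions.

```python
import numpy as np

def low_rank_context(X: np.ndarray, rank: int) -> np.ndarray:
    """Best rank-`rank` approximation of X via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank] * s[:rank] @ Vt[:rank]

rng = np.random.default_rng(0)
# A feature map whose 64 rows (positions) share two global
# patterns across 16 channels, plus a little noise.
basis = rng.normal(size=(2, 16))
coeffs = rng.normal(size=(64, 2))
X = coeffs @ basis + 0.01 * rng.normal(size=(64, 16))

X2 = low_rank_context(X, rank=2)
# Relative reconstruction error: the rank-2 factorization
# captures nearly all of the globally shared signal.
err = np.linalg.norm(X - X2) / np.linalg.norm(X)
```

Unlike self-attention, which computes pairwise position similarities, the decomposition summarizes the whole matrix through a small set of global bases.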
Integrated Low Rank Based Discriminative Feature Learning for Recognition
We present a supervised low-rank-based approach for learning discriminative features.
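One common way to make a low-rank model supervised and discriminative, sketched below as a generic illustration rather than the paper's actual method, is to fit a separate low-rank subspace per class and label a sample by whichever class subspace reconstructs it best.

```python
import numpy as np

def fit_class_bases(X, y, rank):
    # Learn one truncated-SVD row-space basis per class label.
    bases = {}
    for c in np.unique(y):
        _, _, Vt = np.linalg.svd(X[y == c], full_matrices=False)
        bases[c] = Vt[:rank]  # (rank, d), rows are orthonormal
    return bases

def predict(X, bases):
    # Assign each sample to the class whose subspace gives the
    # smallest reconstruction error.
    labels = sorted(bases)
    errs = np.stack([
        np.linalg.norm(X - X @ bases[c].T @ bases[c], axis=1)
        for c in labels
    ])
    return np.array(labels)[np.argmin(errs, axis=0)]

rng = np.random.default_rng(1)
# Two synthetic classes, each generated from its own rank-2 basis.
B0, B1 = rng.normal(size=(2, 10)), rng.normal(size=(2, 10))
X0 = rng.normal(size=(50, 2)) @ B0 + 0.01 * rng.normal(size=(50, 10))
X1 = rng.normal(size=(50, 2)) @ B1 + 0.01 * rng.normal(size=(50, 10))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

bases = fit_class_bases(X, y, rank=2)
acc = np.mean(predict(X, bases) == y)
```

Because the per-class subspaces are fit with labels, the learned low-rank factors are discriminative by construction, in contrast to an unsupervised decomposition of the pooled data.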