- Attention, Please! Revisiting Attentive Probing for Masked Image Modeling (arXiv:2506.10178, published Jun 11, 2025)
- Keep It SimPool: Who Said Supervised Transformers Suffer from Attention Deficit? (arXiv:2309.06891, published Sep 13, 2023)