Should We Still Pretrain Encoders with Masked Language Modeling? Paper 2507.00994, published 22 days ago.