MrBERT: Modern Multilingual Encoders via Vocabulary, Domain, and Dimensional Adaptation
The paper introduces MrBERT, a family of efficient, open-source multilingual encoders built on the ModernBERT architecture. The models achieve state-of-the-art performance in specific languages and specialized domains, and leverage Matryoshka Representation Learning so that embeddings can be truncated to smaller dimensions, reducing inference and storage costs.
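To illustrate the cost-saving mechanism the abstract refers to: Matryoshka Representation Learning trains embeddings so that their leading dimensions are themselves usable representations, which lets users truncate vectors at inference time. The sketch below (using NumPy; the function name and dimensions are illustrative, not from the paper) shows the truncate-and-renormalize step only, not the MRL training objective.

```python
import numpy as np

def truncate_embedding(emb: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` dimensions and re-normalize to unit length.

    With an MRL-trained encoder, this prefix remains a meaningful
    embedding, trading a little accuracy for lower storage and
    faster similarity search.
    """
    sub = emb[:dim]
    return sub / np.linalg.norm(sub)

# Hypothetical full-size embedding (768 dims, unit-normalized).
rng = np.random.default_rng(0)
full = rng.standard_normal(768)
full /= np.linalg.norm(full)

# Truncate to 128 dims: 6x less storage per vector.
small = truncate_embedding(full, 128)
```

In practice one would index the truncated vectors and compute cosine similarity on them directly, keeping the full-size embeddings only where maximum accuracy is required.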