LiLT + XLM-RoBERTa-base

This model was created by combining the Language-Independent Layout Transformer (LiLT) with XLM-RoBERTa base, a multilingual RoBERTa model trained on 100 languages.

This way, we have a LayoutLM-like model for 100 languages :)
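The checkpoint can be loaded with the 🤗 Transformers library. Below is a minimal sketch of getting hidden states from the model: the words and bounding boxes are made-up example values (LiLT expects boxes normalized to a 0-1000 scale, one box per token).

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Load the XLM-RoBERTa tokenizer and the LiLT weights.
tokenizer = AutoTokenizer.from_pretrained("nielsr/lilt-xlm-roberta-base")
model = AutoModel.from_pretrained("nielsr/lilt-xlm-roberta-base")

# Hypothetical example: words and their bounding boxes (0-1000 normalized).
words = ["Hello", "world"]
boxes = [[100, 100, 200, 150], [210, 100, 300, 150]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Map word-level boxes to token-level boxes (special tokens get a zero box).
word_ids = encoding.word_ids(0)
bbox = [boxes[i] if i is not None else [0, 0, 0, 0] for i in word_ids]
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    outputs = model(**encoding)

print(outputs.last_hidden_state.shape)
```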
