MoritzLaurer (HF staff) committed on
Commit: c7db107 · Parent: b2921d6

Update README.md

Files changed (1): README.md +3 -3
README.md CHANGED
@@ -115,10 +115,10 @@ Note that multilingual NLI models are capable of classifying NLI texts without r
 in the specific language (cross-lingual transfer). This means that the model is also able of doing NLI on
 the other languages it was training on, but performance is most likely lower than for those languages available in XNLI.
 
-The average XNLI performance of multilingual-MiniLM-L6 reported in the paper is 0.68 ([see table 11](https://arxiv.org/pdf/2002.10957.pdf)).
-This reimplementation has an average performance of 0.713.
+The average XNLI performance of multilingual-MiniLM-L12 reported in the paper is 0.711 ([see table 11](https://arxiv.org/pdf/2002.10957.pdf)).
+This reimplementation has an average performance of 0.75.
 This increase in performance is probably thanks to the addition of MNLI in the training data and this model was distilled from
-XLM-RoBERTa-large instead of -base (multilingual-MiniLM-L6-v2).
+XLM-RoBERTa-large instead of -base (multilingual-MiniLM-L12-v2).
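The README text being edited above describes cross-lingual transfer: an NLI model trained on XNLI/MNLI can classify text in languages it saw only during pretraining. A minimal sketch of how such a checkpoint is typically used for zero-shot classification with the `transformers` pipeline — the model id below is an assumption for illustration, not stated in this commit; substitute the checkpoint from this repository:

```python
from transformers import pipeline

# Assumed model id for illustration; any multilingual NLI checkpoint
# fine-tuned on MNLI/XNLI works here.
MODEL_ID = "MoritzLaurer/multilingual-MiniLMv2-L12-mnli-xnli"

# The zero-shot-classification pipeline turns NLI entailment scores
# into label probabilities.
classifier = pipeline("zero-shot-classification", model=MODEL_ID)

# Cross-lingual transfer in action: German input text, English candidate labels.
result = classifier(
    "Angela Merkel ist eine Politikerin in Deutschland",
    candidate_labels=["politics", "economy", "sports"],
)

# result["labels"] is sorted by descending score; scores sum to 1.
print(result["labels"][0], result["scores"][0])
```

Because the labels are treated as NLI hypotheses, they can be in a different language from the input, which is exactly the cross-lingual behavior the README paragraph describes.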