saattrupdan committed · verified · Commit e78dbcd · 1 Parent(s): 88db4bb

Update README.md

Files changed (1): README.md (+10 -5)

README.md CHANGED
@@ -34,11 +34,12 @@ This model is a fine-tuned version of [NbAiLab/nb-bert-base](https://huggingface
 
 We have released three models for Scandinavian NLI, of different sizes:
 
+- [alexandrainst/scandi-nli-large-v2](https://huggingface.co/alexandrainst/scandi-nli-large-v2)
 - [alexandrainst/scandi-nli-large](https://huggingface.co/alexandrainst/scandi-nli-large)
 - alexandrainst/scandi-nli-base (this)
 - [alexandrainst/scandi-nli-small](https://huggingface.co/alexandrainst/scandi-nli-small)
 
-A demo of the large model can be found in [this Hugging Face Space](https://huggingface.co/spaces/alexandrainst/zero-shot-classification) - check it out!
+A demo of the large-v2 model can be found in [this Hugging Face Space](https://huggingface.co/spaces/alexandrainst/zero-shot-classification) - check it out!
 
 The performance and model size of each of them can be found in the Performance section below.
 
@@ -80,7 +81,8 @@ The Scandinavian scores are the average of the Danish, Swedish and Norwegian sco
 
 | **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
 | :-------- | :------------ | :--------- | :----------- | :----------- |
-| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | **73.70%** | **74.44%** | **83.91%** | 354M |
+| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **75.42%** | **75.41%** | **84.95%** | 354M |
+| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | 73.70% | 74.44% | 83.91% | 354M |
 | [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 69.01% | 71.99% | 80.66% | 279M |
 | `alexandrainst/scandi-nli-base` (this) | 67.42% | 71.54% | 80.09% | 178M |
 | [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 64.17% | 70.80% | 77.29% | 560M |
@@ -97,7 +99,8 @@ The test split is generated using [this gist](https://gist.github.com/saattrupda
 
 | **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
 | :-------- | :------------ | :--------- | :----------- | :----------- |
-| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | **73.80%** | **58.41%** | **86.98%** | 354M |
+| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **75.65%** | **59.23%** | **87.89%** | 354M |
+| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | 73.80% | 58.41% | 86.98% | 354M |
 | [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 68.37% | 57.10% | 83.25% | 279M |
 | `alexandrainst/scandi-nli-base` (this) | 62.44% | 55.00% | 80.42% | 178M |
 | [`NbAiLab/nb-bert-base-mnli`](https://huggingface.co/NbAiLab/nb-bert-base-mnli) | 56.92% | 53.25% | 76.39% | 178M |
@@ -114,7 +117,8 @@ We acknowledge that not evaluating on a gold standard dataset is not ideal, but
 
 | **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
 | :-------- | :------------ | :--------- | :----------- | :----------- |
-| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | **76.69%** | **84.47%** | **84.38%** | 354M |
+| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **79.02%** | **85.99%** | **85.99%** | 354M |
+| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | 76.69% | 84.47% | 84.38% | 354M |
 | [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 75.35% | 83.42% | 83.55% | 560M |
 | [`MoritzLaurer/mDeBERTa-v3-base-mnli-xnli`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-mnli-xnli) | 73.84% | 82.46% | 82.58% | 279M |
 | [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 73.32% | 82.15% | 82.08% | 279M |
@@ -131,7 +135,8 @@ We acknowledge that not evaluating on a gold standard dataset is not ideal, but
 
 | **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
 | :-------- | :------------ | :--------- | :----------- | :----------- |
-| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | **70.61%** | **80.43%** | **80.36%** | 354M |
+| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **71.59%** | **81.00%** | **80.96%** | 354M |
+| [`alexandrainst/scandi-nli-large`](https://huggingface.co/alexandrainst/scandi-nli-large) | 70.61% | 80.43% | 80.36% | 354M |
 | [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 67.99% | 78.68% | 78.60% | 560M |
 | `alexandrainst/scandi-nli-base` (this) | 67.53% | 78.24% | 78.33% | 178M |
 | [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 65.33% | 76.73% | 76.65% | 279M |
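
For reference, the sketch below shows how any of the NLI models listed in the tables can be used for zero-shot classification through the Hugging Face `transformers` pipeline. The Danish example sentence, candidate labels and hypothesis template are illustrative assumptions, not taken from the model card.

```python
# Minimal sketch: zero-shot classification with an NLI model via the
# transformers pipeline. Example text, labels and template are illustrative.
from transformers import pipeline

# Load the base model behind a zero-shot classification pipeline.
classifier = pipeline(
    "zero-shot-classification",
    model="alexandrainst/scandi-nli-base",
)

# Classify a Danish sentence against a set of candidate topic labels.
result = classifier(
    "Jeg er så glad for, at mit hold vandt kampen i går.",
    candidate_labels=["sport", "politik", "økonomi"],
    hypothesis_template="Dette eksempel handler om {}.",
)

# The pipeline returns labels sorted by score; print the top prediction.
print(result["labels"][0], result["scores"][0])
```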