# xlmr_synset_classifier
This model is a fine-tuned version of FacebookAI/xlm-roberta-large on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5821
- Accuracy: 0.8300
- F1: 0.8189
- Precision: 0.8299
- Recall: 0.8300
- F1 Macro: 0.6291
- Precision Macro: 0.6111
- Recall Macro: 0.6637
- F1 Micro: 0.8300
- Precision Micro: 0.8300
- Recall Micro: 0.8300
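The unlabeled F1, Precision, and Recall appear to be weighted averages (weighted recall coincides with accuracy, as it does here), while the macro scores average over synset classes and are pulled down by infrequent classes. Below is a minimal sketch of how these averages are typically computed with scikit-learn; `y_true` and `y_pred` are placeholder arrays, not the actual evaluation data.

```python
# Sketch only: y_true / y_pred are placeholders standing in for the
# evaluation-set gold synset ids and the model's predictions.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 2, 1, 1]

print("accuracy:", accuracy_score(y_true, y_pred))
for average in ("weighted", "macro", "micro"):
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average=average, zero_division=0
    )
    print(f"{average}: P={precision:.4f} R={recall:.4f} F1={f1:.4f}")
```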
## Model description
More information needed
## Intended uses & limitations
More information needed
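In lieu of documented usage notes, a minimal inference sketch follows. It assumes the checkpoint is hosted as kugler/xlmr-large-AmDi-synset-classifier with a standard sequence-classification head whose `id2label` mapping holds the synset labels; the input text is a placeholder.

```python
# Hedged sketch: the repository id, task head, and label mapping are
# assumptions; they are not documented in this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "kugler/xlmr-large-AmDi-synset-classifier"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example sentence to classify."  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```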
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 5
- mixed_precision_training: Native AMP
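As a rough guide, these settings correspond to the following Hugging Face `TrainingArguments`; this is a sketch only, since the output directory name is illustrative and the datasets, tokenizer, and `compute_metrics` wiring are not documented here.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# The effective batch size of 128 comes from 32 per device x 4 gradient
# accumulation steps on a single device; output_dir is illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlmr_synset_classifier",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    warmup_steps=50,
    seed=42,
    fp16=True,  # "Native AMP" mixed precision
)
```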
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | F1 Macro | Precision Macro | Recall Macro | F1 Micro | Precision Micro | Recall Micro |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3.6302 | 0.6221 | 100 | 2.3997 | 0.4612 | 0.3541 | 0.3218 | 0.4612 | 0.1136 | 0.1245 | 0.1308 | 0.4612 | 0.4612 | 0.4612 |
| 1.6212 | 1.2442 | 200 | 0.9750 | 0.7479 | 0.7052 | 0.7046 | 0.7479 | 0.4218 | 0.4214 | 0.4650 | 0.7479 | 0.7479 | 0.7479 |
| 0.9307 | 1.8663 | 300 | 0.7650 | 0.7936 | 0.7685 | 0.7863 | 0.7936 | 0.5217 | 0.5204 | 0.5619 | 0.7936 | 0.7936 | 0.7936 |
| 0.6977 | 2.4883 | 400 | 0.6956 | 0.8089 | 0.7935 | 0.8090 | 0.8089 | 0.5696 | 0.5599 | 0.6015 | 0.8089 | 0.8089 | 0.8089 |
| 0.6152 | 3.1104 | 500 | 0.6451 | 0.8188 | 0.8051 | 0.8224 | 0.8188 | 0.6021 | 0.5949 | 0.6321 | 0.8188 | 0.8188 | 0.8188 |
| 0.5171 | 3.7325 | 600 | 0.5960 | 0.8331 | 0.8209 | 0.8322 | 0.8331 | 0.6287 | 0.6304 | 0.6524 | 0.8331 | 0.8331 | 0.8331 |
| 0.4772 | 4.3546 | 700 | 0.5903 | 0.8286 | 0.8178 | 0.8291 | 0.8286 | 0.6305 | 0.6244 | 0.6587 | 0.8286 | 0.8286 | 0.8286 |
| 0.437 | 4.9767 | 800 | 0.5821 | 0.8300 | 0.8189 | 0.8299 | 0.8300 | 0.6291 | 0.6111 | 0.6637 | 0.8300 | 0.8300 | 0.8300 |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.20.3