# xlm-roberta-large_massive_crf_v1
This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on an unknown dataset (the model name suggests the MASSIVE intent-classification and slot-filling dataset). It achieves the following results on the evaluation set:
- Loss: 5.3967
- Slot P: 0.7375
- Slot R: 0.7801
- Slot F1: 0.7582
- Slot Exact Match: 0.7260
- Intent Acc: 0.8687
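The exact evaluation script is not part of this card; the slot P/R/F1 values are presumably entity-level scores over BIO slot tags, and "Slot Exact Match" the fraction of utterances whose full slot sequence is predicted correctly. A minimal sketch of how such metrics are commonly computed, assuming seqeval and BIO tagging (tag and intent names below are illustrative only):

```python
# Hedged sketch: entity-level slot metrics via seqeval, plus utterance-level
# exact match and intent accuracy. Not the author's actual evaluation code.
from seqeval.metrics import f1_score, precision_score, recall_score

gold_slots = [["O", "B-date", "I-date", "O"], ["B-place_name", "O", "O"]]
pred_slots = [["O", "B-date", "I-date", "O"], ["O", "O", "O"]]

slot_p = precision_score(gold_slots, pred_slots)
slot_r = recall_score(gold_slots, pred_slots)
slot_f1 = f1_score(gold_slots, pred_slots)
# Exact match: every slot tag in the utterance must be correct.
slot_em = sum(g == p for g, p in zip(gold_slots, pred_slots)) / len(gold_slots)

gold_intents = ["alarm_set", "weather_query"]
pred_intents = ["alarm_set", "weather_query"]
intent_acc = sum(g == p for g, p in zip(gold_intents, pred_intents)) / len(gold_intents)

print(slot_p, slot_r, slot_f1, slot_em, intent_acc)
```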
## Model description
More information needed
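The model name ("crf") and the joint slot/intent metrics above suggest xlm-roberta-large with a token-level slot head decoded by a CRF plus a sentence-level intent head. The sketch below is one plausible layout using the pytorch-crf package, not the author's actual implementation; all class and variable names are assumptions for illustration.

```python
# Hedged sketch: XLM-R encoder, linear slot head with CRF decoding
# (pytorch-crf package), and a linear intent head over the <s> token.
import torch.nn as nn
from torchcrf import CRF  # pip install pytorch-crf
from transformers import XLMRobertaModel

class XlmrCrfIntentSlot(nn.Module):
    def __init__(self, num_intents: int, num_slots: int):
        super().__init__()
        self.encoder = XLMRobertaModel.from_pretrained("FacebookAI/xlm-roberta-large")
        hidden = self.encoder.config.hidden_size
        self.intent_head = nn.Linear(hidden, num_intents)
        self.slot_head = nn.Linear(hidden, num_slots)
        self.crf = CRF(num_slots, batch_first=True)

    def forward(self, input_ids, attention_mask, slot_labels=None, intent_labels=None):
        seq = self.encoder(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        intent_logits = self.intent_head(seq[:, 0])   # <s> token as sentence vector
        emissions = self.slot_head(seq)               # (batch, seq_len, num_slots)
        mask = attention_mask.bool()
        if slot_labels is not None:                   # training: joint loss
            slot_loss = -self.crf(emissions, slot_labels, mask=mask, reduction="mean")
            intent_loss = nn.functional.cross_entropy(intent_logits, intent_labels)
            return slot_loss + intent_loss
        # inference: Viterbi-decoded slot tag ids plus intent logits
        return self.crf.decode(emissions, mask=mask), intent_logits
```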
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 256
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 30
- mixed_precision_training: Native AMP
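For reference, these hyperparameters map onto Hugging Face `TrainingArguments` roughly as follows. This is a reconstruction, not the author's training script, and `output_dir` is a placeholder:

```python
# Hedged sketch of TrainingArguments matching the listed hyperparameters,
# assuming the standard Hugging Face Trainer was used.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="xlm-roberta-large_massive_crf_v1",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=2,   # effective train batch size: 256
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.06,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed precision
)
```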
### Training results
| Training Loss | Epoch | Step | Validation Loss | Slot P | Slot R | Slot F1 | Slot Exact Match | Intent Acc |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:-------:|:----------------:|:----------:|
| No log        | 1.0   | 45   | 19.9807         | 0.0    | 0.0    | 0.0     | 0.3187           | 0.0649     |
| 84.6375       | 2.0   | 90   | 9.4999          | 0.5191 | 0.5    | 0.5094  | 0.4634           | 0.3453     |
| 26.6202       | 3.0   | 135  | 4.9217          | 0.6025 | 0.7060 | 0.6502  | 0.6119           | 0.7708     |
| 11.9124       | 4.0   | 180  | 4.0265          | 0.6567 | 0.7338 | 0.6931  | 0.6749           | 0.8411     |
| 7.1611        | 5.0   | 225  | 3.6002          | 0.6822 | 0.7647 | 0.7211  | 0.6985           | 0.8569     |
| 5.4223        | 6.0   | 270  | 3.6653          | 0.7199 | 0.7687 | 0.7435  | 0.7191           | 0.8578     |
| 4.1914        | 7.0   | 315  | 3.6212          | 0.7156 | 0.7711 | 0.7423  | 0.7201           | 0.8628     |
| 3.4499        | 8.0   | 360  | 3.9382          | 0.7019 | 0.7836 | 0.7405  | 0.7078           | 0.8662     |
| 2.8646        | 9.0   | 405  | 4.0638          | 0.7059 | 0.7856 | 0.7436  | 0.7093           | 0.8652     |
| 2.3159        | 10.0  | 450  | 4.1920          | 0.7117 | 0.7751 | 0.7421  | 0.7127           | 0.8682     |
| 2.3159        | 11.0  | 495  | 4.3891          | 0.7110 | 0.7736 | 0.7410  | 0.7142           | 0.8731     |
| 1.8513        | 12.0  | 540  | 4.4429          | 0.7295 | 0.7821 | 0.7549  | 0.7231           | 0.8741     |
| 1.6003        | 13.0  | 585  | 4.7107          | 0.7317 | 0.7841 | 0.7570  | 0.7211           | 0.8775     |
| 1.3617        | 14.0  | 630  | 4.8732          | 0.7311 | 0.7751 | 0.7525  | 0.7231           | 0.8721     |
| 1.0467        | 15.0  | 675  | 5.0702          | 0.7230 | 0.7816 | 0.7511  | 0.7211           | 0.8687     |
| 0.9095        | 16.0  | 720  | 5.1884          | 0.7377 | 0.7935 | 0.7646  | 0.7304           | 0.8775     |
| 0.7422        | 17.0  | 765  | 5.1458          | 0.7267 | 0.7831 | 0.7538  | 0.7182           | 0.8711     |
| 0.6169        | 18.0  | 810  | 5.5018          | 0.7326 | 0.7836 | 0.7572  | 0.7231           | 0.8711     |
| 0.5221        | 19.0  | 855  | 5.3967          | 0.7375 | 0.7801 | 0.7582  | 0.7260           | 0.8687     |
### Framework versions
- Transformers 4.55.0
- PyTorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.4