---
library_name: transformers
license: mit
base_model: FacebookAI/xlm-roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: xlm_roberta_large_test_linsearch_only_abstract
  results: []
---

# xlm_roberta_large_test_linsearch_only_abstract

This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3349
- Accuracy: 0.6504
- F1 Macro: 0.6037
- Precision Macro: 0.6113
- Recall Macro: 0.6008

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:--------:|:---------------:|:------------:|
| 1.2114        | 1.0    | 4931  | 1.2224          | 0.6245   | 0.5249   | 0.5555          | 0.5501       |
| 1.0885        | 2.0    | 9862  | 1.1036          | 0.6427   | 0.5596   | 0.6009          | 0.5591       |
| 0.9781        | 3.0    | 14793 | 1.0828          | 0.6491   | 0.5760   | 0.6188          | 0.5811       |
| 0.8621        | 4.0    | 19724 | 1.0956          | 0.6569   | 0.5979   | 0.6363          | 0.6014       |
| 0.7267        | 5.0    | 24655 | 1.0899          | 0.6626   | 0.5970   | 0.6088          | 0.5941       |
| 0.6066        | 6.0    | 29586 | 1.2078          | 0.6517   | 0.5928   | 0.6177          | 0.5848       |
| 0.4627        | 7.0    | 34517 | 1.3349          | 0.6504   | 0.6037   | 0.6113          | 0.6008       |
| 0.3238        | 8.0    | 39448 | 1.5315          | 0.6398   | 0.5951   | 0.6064          | 0.5891       |
| 0.2240        | 9.0    | 44379 | 1.8234          | 0.6438   | 0.5936   | 0.5994          | 0.5897       |
| 0.1490        | 9.9981 | 49300 | 2.0762          | 0.6442   | 0.6011   | 0.6024          | 0.6007       |

### Framework versions

- Transformers 4.50.1
- Pytorch 2.5.1+cu121
- Datasets 3.4.1
- Tokenizers 0.21.1
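For reference, the hyperparameters listed above can be approximated with a `transformers.TrainingArguments` configuration like the sketch below. This is a reconstruction, not the exact training script: `output_dir` is a placeholder, and the per-device batch size of 4 with 4 gradient-accumulation steps is what yields the total train batch size of 16 on a single device.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm_roberta_large_test_linsearch_only_abstract",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,   # 4 * 4 = effective batch size 16 on one device
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
    fp16=True,                       # Native AMP mixed-precision training
)
```

Passing this object to a `Trainer` together with the base model and the (undocumented) dataset should reproduce the schedule above.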