scenario-TCR-XLMV-XCOPA-1_data-xcopa_all

This model is a fine-tuned version of facebook/xlm-v-base on the XCOPA dataset (all languages, per the data-xcopa_all suffix in the model name). It achieves the following results on the evaluation set (a minimal usage sketch follows the list):

  • Loss: 0.6931
  • Accuracy: 0.5592
  • F1: 0.5289
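
The snippet below is a minimal sketch of loading the checkpoint for inference. It is not the author's evaluation code: it assumes the checkpoint exposes a binary sequence-classification head that scores each XCOPA-style (premise, choice) pair independently, with label 1 meaning "correct choice"; verify the actual head in the repository's config.json before relying on it.

```python
# Minimal inference sketch (assumption: binary sequence-classification head
# over (premise, choice) pairs -- verify against the repo's config.json).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "haryoaw/scenario-TCR-XLMV-XCOPA-1_data-xcopa_all"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Illustrative XCOPA-style inputs (not taken from the actual evaluation set).
premise = "The man turned off the lights."
choices = ["The room became dark.", "The room became warm."]

# Score each candidate against the premise and keep the higher-scoring one.
with torch.no_grad():
    scores = []
    for choice in choices:
        inputs = tokenizer(premise, choice, return_tensors="pt", truncation=True)
        scores.append(model(**inputs).logits[0, 1].item())  # assumed: label 1 = "correct"

print(choices[scores.index(max(scores))])
```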

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 500
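
These settings map onto transformers TrainingArguments roughly as follows. This is a reconstruction from the list above, not the original training script: output_dir is a placeholder, and the 5-step evaluation cadence is read off the results table below.

```python
# Hedged reconstruction of the listed hyperparameters (not the original script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-TCR-XLMV-XCOPA-1_data-xcopa_all",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,               # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=500,
    evaluation_strategy="steps",  # inferred: the table reports eval every 5 steps
    eval_steps=5,
)
```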

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 0.38 | 5 | 0.6932 | 0.4917 | 0.4383 |
| No log | 0.77 | 10 | 0.6931 | 0.5192 | 0.5064 |
| No log | 1.15 | 15 | 0.6931 | 0.5017 | 0.4613 |
| No log | 1.54 | 20 | 0.6932 | 0.4942 | 0.4576 |
| No log | 1.92 | 25 | 0.6931 | 0.5050 | 0.4629 |
| No log | 2.31 | 30 | 0.6931 | 0.5000 | 0.4643 |
| No log | 2.69 | 35 | 0.6931 | 0.4892 | 0.4580 |
| No log | 3.08 | 40 | 0.6931 | 0.4833 | 0.4552 |
| No log | 3.46 | 45 | 0.6932 | 0.4967 | 0.4588 |
| No log | 3.85 | 50 | 0.6931 | 0.5042 | 0.4711 |
| No log | 4.23 | 55 | 0.6931 | 0.5108 | 0.4846 |
| No log | 4.62 | 60 | 0.6932 | 0.4875 | 0.4591 |
| No log | 5.0 | 65 | 0.6931 | 0.4958 | 0.4641 |
| No log | 5.38 | 70 | 0.6931 | 0.4933 | 0.4777 |
| No log | 5.77 | 75 | 0.6931 | 0.5075 | 0.4901 |
| No log | 6.15 | 80 | 0.6931 | 0.4833 | 0.4464 |
| No log | 6.54 | 85 | 0.6931 | 0.5175 | 0.4917 |
| No log | 6.92 | 90 | 0.6931 | 0.4442 | 0.4225 |
| No log | 7.31 | 95 | 0.6931 | 0.4583 | 0.4377 |
| No log | 7.69 | 100 | 0.6931 | 0.5192 | 0.4978 |
| No log | 8.08 | 105 | 0.6931 | 0.5425 | 0.5230 |
| No log | 8.46 | 110 | 0.6931 | 0.5350 | 0.5122 |
| No log | 8.85 | 115 | 0.6931 | 0.5450 | 0.5194 |
| No log | 9.23 | 120 | 0.6931 | 0.5492 | 0.5259 |
| No log | 9.62 | 125 | 0.6931 | 0.5350 | 0.5114 |
| No log | 10.0 | 130 | 0.6931 | 0.5475 | 0.5233 |
| No log | 10.38 | 135 | 0.6931 | 0.5525 | 0.5269 |
| No log | 10.77 | 140 | 0.6931 | 0.5458 | 0.5223 |
| No log | 11.15 | 145 | 0.6931 | 0.5392 | 0.5145 |
| No log | 11.54 | 150 | 0.6931 | 0.5483 | 0.5246 |
| No log | 11.92 | 155 | 0.6931 | 0.5342 | 0.5084 |
| No log | 12.31 | 160 | 0.6931 | 0.5400 | 0.5158 |
| No log | 12.69 | 165 | 0.6931 | 0.5375 | 0.5084 |
| No log | 13.08 | 170 | 0.6931 | 0.5433 | 0.5133 |
| No log | 13.46 | 175 | 0.6931 | 0.5333 | 0.5096 |
| No log | 13.85 | 180 | 0.6931 | 0.5458 | 0.5215 |
| No log | 14.23 | 185 | 0.6931 | 0.5508 | 0.5259 |
| No log | 14.62 | 190 | 0.6931 | 0.5433 | 0.5168 |
| No log | 15.0 | 195 | 0.6931 | 0.5500 | 0.5280 |
| No log | 15.38 | 200 | 0.6931 | 0.5442 | 0.5231 |
| No log | 15.77 | 205 | 0.6931 | 0.5500 | 0.5280 |
| No log | 16.15 | 210 | 0.6931 | 0.5458 | 0.5257 |
| No log | 16.54 | 215 | 0.6931 | 0.5392 | 0.5195 |
| No log | 16.92 | 220 | 0.6931 | 0.5367 | 0.5165 |
| No log | 17.31 | 225 | 0.6931 | 0.5433 | 0.5235 |
| No log | 17.69 | 230 | 0.6931 | 0.5500 | 0.5271 |
| No log | 18.08 | 235 | 0.6931 | 0.5425 | 0.5222 |
| No log | 18.46 | 240 | 0.6931 | 0.5417 | 0.5158 |
| No log | 18.85 | 245 | 0.6931 | 0.4983 | 0.4719 |
| No log | 19.23 | 250 | 0.6931 | 0.5483 | 0.5237 |
| No log | 19.62 | 255 | 0.6931 | 0.5425 | 0.5230 |
| No log | 20.0 | 260 | 0.6931 | 0.5467 | 0.5220 |
| No log | 20.38 | 265 | 0.6931 | 0.5467 | 0.5220 |
| No log | 20.77 | 270 | 0.6931 | 0.5508 | 0.5251 |
| No log | 21.15 | 275 | 0.6931 | 0.5550 | 0.5283 |
| No log | 21.54 | 280 | 0.6931 | 0.5533 | 0.5257 |
| No log | 21.92 | 285 | 0.6931 | 0.5550 | 0.5283 |
| No log | 22.31 | 290 | 0.6931 | 0.5533 | 0.5298 |
| No log | 22.69 | 295 | 0.6931 | 0.5517 | 0.5281 |
| No log | 23.08 | 300 | 0.6931 | 0.5567 | 0.5325 |
| No log | 23.46 | 305 | 0.6931 | 0.5500 | 0.5288 |
| No log | 23.85 | 310 | 0.6931 | 0.5475 | 0.5233 |
| No log | 24.23 | 315 | 0.6931 | 0.5467 | 0.5220 |
| No log | 24.62 | 320 | 0.6931 | 0.5500 | 0.5246 |
| No log | 25.0 | 325 | 0.6931 | 0.5483 | 0.5212 |
| No log | 25.38 | 330 | 0.6931 | 0.5467 | 0.5203 |
| No log | 25.77 | 335 | 0.6931 | 0.5483 | 0.5204 |
| No log | 26.15 | 340 | 0.6931 | 0.5492 | 0.5225 |
| No log | 26.54 | 345 | 0.6931 | 0.5492 | 0.5250 |
| No log | 26.92 | 350 | 0.6931 | 0.5542 | 0.5295 |
| No log | 27.31 | 355 | 0.6931 | 0.5567 | 0.5350 |
| No log | 27.69 | 360 | 0.6931 | 0.5533 | 0.5290 |
| No log | 28.08 | 365 | 0.6931 | 0.5558 | 0.5296 |
| No log | 28.46 | 370 | 0.6931 | 0.5542 | 0.5270 |
| No log | 28.85 | 375 | 0.6931 | 0.5383 | 0.5166 |
| No log | 29.23 | 380 | 0.6931 | 0.5483 | 0.5220 |
| No log | 29.62 | 385 | 0.6931 | 0.5475 | 0.5190 |
| No log | 30.0 | 390 | 0.6931 | 0.5483 | 0.5212 |
| No log | 30.38 | 395 | 0.6931 | 0.5208 | 0.4871 |
| No log | 30.77 | 400 | 0.6931 | 0.4867 | 0.4690 |
| No log | 31.15 | 405 | 0.6931 | 0.4850 | 0.4663 |
| No log | 31.54 | 410 | 0.6931 | 0.4550 | 0.4313 |
| No log | 31.92 | 415 | 0.6931 | 0.4608 | 0.4369 |
| No log | 32.31 | 420 | 0.6931 | 0.4617 | 0.4421 |
| No log | 32.69 | 425 | 0.6931 | 0.5258 | 0.4942 |
| No log | 33.08 | 430 | 0.6931 | 0.5608 | 0.5340 |
| No log | 33.46 | 435 | 0.6931 | 0.5583 | 0.5310 |
| No log | 33.85 | 440 | 0.6931 | 0.5600 | 0.5352 |
| No log | 34.23 | 445 | 0.6931 | 0.5567 | 0.5325 |
| No log | 34.62 | 450 | 0.6931 | 0.5525 | 0.5277 |
| No log | 35.0 | 455 | 0.6931 | 0.5542 | 0.5303 |
| No log | 35.38 | 460 | 0.6931 | 0.5633 | 0.5379 |
| No log | 35.77 | 465 | 0.6931 | 0.5542 | 0.5295 |
| No log | 36.15 | 470 | 0.6931 | 0.5567 | 0.5309 |
| No log | 36.54 | 475 | 0.6931 | 0.5550 | 0.5291 |
| No log | 36.92 | 480 | 0.6931 | 0.5575 | 0.5330 |
| No log | 37.31 | 485 | 0.6931 | 0.5517 | 0.5256 |
| No log | 37.69 | 490 | 0.6931 | 0.5450 | 0.5168 |
| No log | 38.08 | 495 | 0.6931 | 0.5400 | 0.5132 |
| 0.6936 | 38.46 | 500 | 0.6931 | 0.5500 | 0.5238 |
| 0.6936 | 38.85 | 505 | 0.6931 | 0.5425 | 0.5120 |
| 0.6936 | 39.23 | 510 | 0.6931 | 0.5400 | 0.5106 |
| 0.6936 | 39.62 | 515 | 0.6931 | 0.5242 | 0.4906 |
| 0.6936 | 40.0 | 520 | 0.6931 | 0.5292 | 0.4978 |
| 0.6936 | 40.38 | 525 | 0.6931 | 0.5300 | 0.5009 |
| 0.6936 | 40.77 | 530 | 0.6931 | 0.5308 | 0.5031 |
| 0.6936 | 41.15 | 535 | 0.6931 | 0.5425 | 0.5205 |
| 0.6936 | 41.54 | 540 | 0.6931 | 0.5350 | 0.5088 |
| 0.6936 | 41.92 | 545 | 0.6931 | 0.5342 | 0.5084 |
| 0.6936 | 42.31 | 550 | 0.6931 | 0.5425 | 0.5205 |
| 0.6936 | 42.69 | 555 | 0.6931 | 0.5475 | 0.5241 |
| 0.6936 | 43.08 | 560 | 0.6931 | 0.5517 | 0.5264 |
| 0.6936 | 43.46 | 565 | 0.6931 | 0.5592 | 0.5339 |
| 0.6936 | 43.85 | 570 | 0.6931 | 0.5625 | 0.5350 |
| 0.6936 | 44.23 | 575 | 0.6931 | 0.5625 | 0.5358 |
| 0.6936 | 44.62 | 580 | 0.6931 | 0.5617 | 0.5337 |
| 0.6936 | 45.0 | 585 | 0.6931 | 0.5633 | 0.5355 |
| 0.6936 | 45.38 | 590 | 0.6931 | 0.5600 | 0.5344 |
| 0.6936 | 45.77 | 595 | 0.6931 | 0.5625 | 0.5350 |
| 0.6936 | 46.15 | 600 | 0.6931 | 0.5550 | 0.5258 |
| 0.6936 | 46.54 | 605 | 0.6931 | 0.5625 | 0.5350 |
| 0.6936 | 46.92 | 610 | 0.6931 | 0.5592 | 0.5289 |
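
One pattern worth flagging when reading the table: the validation loss sits at roughly 0.6931 for the entire run ("No log" in the first column means the training loss had not yet been logged; it first appears at step 500). That value matches the cross-entropy of a uniform prediction over two classes:

$$-\ln\tfrac{1}{2} = \ln 2 \approx 0.6931$$

which suggests the head's logits stay very close together, with the above-chance accuracy coming from small differences around that plateau.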

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.1.1+cu121
  • Datasets 2.14.5
  • Tokenizers 0.13.3
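
To reproduce the environment, the pinned versions above can be checked at runtime, e.g.:

```python
# Quick environment check against the versions listed above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.33.3
print(torch.__version__)         # expected: 2.1.1+cu121
print(datasets.__version__)      # expected: 2.14.5
print(tokenizers.__version__)    # expected: 0.13.3
```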