roberta-2020-Q1-filtered

This model is a fine-tuned version of roberta-base on an unknown dataset. It achieves the following results on the evaluation set (a short usage sketch follows the list):

  • Loss: 2.7087
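If this value is the standard masked-language-modeling cross-entropy loss (an assumption; the card does not state the training objective), it corresponds to a perplexity of roughly exp(2.7087) ≈ 15. Below is a minimal, hedged usage sketch under that same assumption, loading the model by its repository id with the fill-mask pipeline:

```python
# Minimal sketch, assuming an MLM-style fine-tune of roberta-base;
# the card does not state the objective or intended use explicitly.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="DouglasPontes/roberta-2020-Q1-filtered",
)

# RoBERTa checkpoints use <mask> as the mask token.
print(fill_mask("The new policy takes effect <mask> week."))
```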

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1400
  • training_steps: 2400000
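A hedged sketch of how these settings map onto Hugging Face `TrainingArguments`; the actual training script is not published, so the output directory and any option not listed above are placeholders:

```python
# Sketch only: restates the listed hyperparameters as TrainingArguments.
# output_dir and all unlisted options are placeholders, not the authors' values.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-2020-Q1-filtered",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1400,
    max_steps=2_400_000,
)
```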

Training results

Training Loss Epoch Step Validation Loss
No log 0.07 8000 2.9780
3.1296 0.13 16000 2.8948
3.1296 0.2 24000 2.8590
2.9018 0.26 32000 2.8033
2.9018 0.33 40000 2.7938
2.8331 0.39 48000 2.7695
2.8331 0.46 56000 2.7614
2.7723 0.52 64000 2.7417
2.7723 0.59 72000 2.7249
2.75 0.65 80000 2.7202
2.75 0.72 88000 2.7112
2.735 0.78 96000 2.7229
2.735 0.85 104000 2.7371
2.7137 0.91 112000 2.7059
2.7137 0.98 120000 2.7121
2.7155 1.04 128000 2.7249
2.7155 1.11 136000 2.7131
2.7152 1.17 144000 2.7000
2.7152 1.24 152000 2.7030
2.7151 1.3 160000 2.7214
2.7151 1.37 168000 2.7076
2.7166 1.44 176000 2.7106
2.7166 1.5 184000 2.7197
2.7144 1.57 192000 2.7101
2.7144 1.63 200000 2.7235
2.7179 1.7 208000 2.7066
2.7179 1.76 216000 2.7283
2.7231 1.83 224000 2.7203
2.7231 1.89 232000 2.7111
2.7284 1.96 240000 2.7217
2.7284 2.02 248000 2.7251
2.7242 2.09 256000 2.7181
2.7242 2.15 264000 2.7238
2.7171 2.22 272000 2.7488
2.7171 2.28 280000 2.7315
2.7312 2.35 288000 2.7469
2.7312 2.41 296000 2.7363
2.7386 2.48 304000 2.7398
2.7386 2.54 312000 2.7477
2.7457 2.61 320000 2.7536
2.7457 2.67 328000 2.7483
2.7496 2.74 336000 2.7529
2.7496 2.8 344000 2.7492
2.7521 2.87 352000 2.7612
2.7521 2.94 360000 2.7701
2.7649 3.0 368000 2.7705
2.7649 3.07 376000 2.7828
2.7516 3.13 384000 2.7680
2.7516 3.2 392000 2.7843
2.762 3.26 400000 2.7916
2.762 3.33 408000 2.7692
2.7789 3.39 416000 2.7834
2.7789 3.46 424000 2.7788
2.7879 3.52 432000 2.8037
2.7879 3.59 440000 2.7919
2.7853 3.65 448000 2.8077
2.7853 3.72 456000 2.7903
2.7976 3.78 464000 2.8109
2.7976 3.85 472000 2.7957
2.789 3.91 480000 2.8023
2.789 3.98 488000 2.8126
2.8089 4.04 496000 2.8154
2.8089 4.11 504000 2.8123
2.7915 4.17 512000 2.8146
2.7915 4.24 520000 2.8250
2.8094 4.31 528000 2.8206
2.8094 4.37 536000 2.8182
2.8196 4.44 544000 2.8351
2.8196 4.5 552000 2.8394
2.8316 4.57 560000 2.8397
2.8316 4.63 568000 2.8403
2.8444 4.7 576000 2.8351
2.8444 4.76 584000 2.8574
2.833 4.83 592000 2.8617
2.833 4.89 600000 2.8578
2.839 4.96 608000 2.8577
2.839 5.02 616000 2.8727
2.8427 5.09 624000 2.8586
2.8427 5.15 632000 2.8808
2.8599 5.22 640000 2.8960
2.8599 5.28 648000 2.8883
2.8694 5.35 656000 2.8885
2.8694 5.41 664000 2.8873
2.8626 5.48 672000 2.8930
2.8626 5.54 680000 2.8988
2.8921 5.61 688000 2.9117
2.8921 5.68 696000 2.9122
2.8884 5.74 704000 2.9001
2.8884 5.81 712000 2.9094
2.8974 5.87 720000 2.9110
2.8974 5.94 728000 2.9045
2.903 6.0 736000 2.9337
2.903 6.07 744000 2.9316
2.9057 6.13 752000 2.9447
2.9057 6.2 760000 2.9363
2.9146 6.26 768000 2.9438
2.9146 6.33 776000 2.9475
2.9221 6.39 784000 2.9394
2.9221 6.46 792000 2.9371
2.9316 6.52 800000 2.9494
2.9316 6.59 808000 2.9727
2.9421 6.65 816000 2.9759
2.9421 6.72 824000 2.9665
2.9538 6.78 832000 2.9650
2.9538 6.85 840000 2.9761
2.9594 6.91 848000 2.9901
2.9594 6.98 856000 2.9732
2.9564 7.05 864000 2.9897
2.9564 7.11 872000 2.9801
2.9561 7.18 880000 2.9839
2.9561 7.24 888000 2.9888
2.9669 7.31 896000 3.0000
2.9669 7.37 904000 2.9786
2.9649 7.44 912000 2.9946
2.9649 7.5 920000 3.0002
2.9665 7.57 928000 2.9960
2.9665 7.63 936000 3.0068
2.9708 7.7 944000 2.9938
2.9708 7.76 952000 3.0126
2.981 7.83 960000 2.9959
2.981 7.89 968000 2.9960
2.9805 7.96 976000 2.9919
2.9805 8.02 984000 3.0058
2.9705 8.09 992000 3.0232
2.9705 8.15 1000000 3.0047
2.9715 8.22 1008000 3.0069
2.9715 8.28 1016000 3.0019
2.9695 8.35 1024000 3.0216
2.9695 8.41 1032000 3.0219
2.9762 8.48 1040000 3.0182
2.9762 8.55 1048000 3.0332
2.9786 8.61 1056000 3.0017
2.9786 8.68 1064000 3.0236
2.9889 8.74 1072000 3.0273
2.9889 8.81 1080000 3.0197
2.9842 8.87 1088000 3.0376
2.9842 8.94 1096000 3.0323
2.9912 9.0 1104000 3.0317
2.9912 9.07 1112000 3.0225
2.9919 9.13 1120000 3.0361
2.9919 9.2 1128000 3.0432
2.9872 9.26 1136000 3.0307
2.9872 9.33 1144000 3.0482
2.9823 9.39 1152000 3.0354
2.9823 9.46 1160000 3.0419
2.9882 9.52 1168000 3.0567
2.9882 9.59 1176000 3.0395
3.0079 9.65 1184000 3.0572
3.0079 9.72 1192000 3.0403
3.0243 9.78 1200000 3.0472
3.0243 9.85 1208000 3.0523
3.0127 9.92 1216000 3.0534
3.0127 9.98 1224000 3.0434
3.0106 10.05 1232000 3.0687
3.0106 10.11 1240000 3.0678
3.0063 10.18 1248000 3.0652
3.0063 10.24 1256000 3.0768
3.0187 10.31 1264000 3.0692
3.0187 10.37 1272000 3.0621
3.0202 10.44 1280000 3.0663
3.0202 10.5 1288000 3.0537
3.0219 10.57 1296000 3.0725
3.0219 10.63 1304000 3.0664
3.0232 10.7 1312000 3.0724
3.0232 10.76 1320000 3.0476
3.0247 10.83 1328000 3.0729
3.0247 10.89 1336000 3.0646
3.0335 10.96 1344000 3.0604
3.0335 11.02 1352000 3.0631
3.0182 11.09 1360000 3.0669
3.0182 11.15 1368000 3.0626
3.0124 11.22 1376000 3.0535
3.0124 11.29 1384000 3.0768
3.016 11.35 1392000 3.0615
3.016 11.42 1400000 3.0689
3.0133 11.48 1408000 3.0699
3.0133 11.55 1416000 3.0647
3.0227 11.61 1424000 3.0705
3.0227 11.68 1432000 3.0706
3.0267 11.74 1440000 3.0694
3.0267 11.81 1448000 3.0721
3.021 11.87 1456000 3.0690
3.021 11.94 1464000 3.0603
3.0144 12.0 1472000 3.0658
3.0144 12.07 1480000 3.0720
3.0204 12.13 1488000 3.0668
3.0204 12.2 1496000 3.0773
3.0085 12.26 1504000 3.0848
3.0085 12.33 1512000 3.0568
3.0146 12.39 1520000 3.0783
3.0146 12.46 1528000 3.0736
3.02 12.52 1536000 3.0534
3.02 12.59 1544000 3.0684
3.0229 12.65 1552000 3.0767
3.0229 12.72 1560000 3.0569
3.0152 12.79 1568000 3.0788
3.0152 12.85 1576000 3.0663
3.02 12.92 1584000 3.0670
3.02 12.98 1592000 3.0683
3.0128 13.05 1600000 3.0718
3.0128 13.11 1608000 3.0847
3.016 13.18 1616000 3.0664
3.016 13.24 1624000 3.0688
3.0007 13.31 1632000 3.0741
3.0007 13.37 1640000 3.0663
3.0241 13.44 1648000 3.0607
3.0241 13.5 1656000 3.0635
3.0103 13.57 1664000 3.0731
3.0103 13.63 1672000 3.0649
3.0188 13.7 1680000 3.0587
3.0188 13.76 1688000 3.0704
3.0217 13.83 1696000 3.0664
3.0217 13.89 1704000 3.0627
3.0282 13.96 1712000 3.0714
3.0282 14.02 1720000 3.0688
3.0166 14.09 1728000 3.0521
3.0166 14.16 1736000 3.0538
3.0134 14.22 1744000 3.0641
3.0134 14.29 1752000 3.0639
3.0032 14.35 1760000 3.0588
3.0032 14.42 1768000 3.0646
3.0136 14.48 1776000 3.0629
3.0136 14.55 1784000 3.0578
3.0086 14.61 1792000 3.0529
3.0086 14.68 1800000 3.0615
3.019 14.74 1808000 3.0566
3.019 14.81 1816000 3.0659
3.024 14.87 1824000 3.0615
3.024 14.94 1832000 3.0530
3.0089 15.0 1840000 3.0797
3.0089 15.07 1848000 3.0700
3.0174 15.13 1856000 3.0748
3.0174 15.2 1864000 3.0643
3.0176 15.26 1872000 3.0628
3.0176 15.33 1880000 3.0630
3.0164 15.39 1888000 3.0722
3.0164 15.46 1896000 3.0744
3.0302 15.53 1904000 3.0739
3.0302 15.59 1912000 3.0700
3.0204 15.66 1920000 3.0751
3.0204 15.72 1928000 3.0598
3.0147 15.79 1936000 3.0522
3.0147 15.85 1944000 3.0655
3.0245 15.92 1952000 3.0569
3.0245 15.98 1960000 3.0623
3.0069 16.05 1968000 3.0600
3.0069 16.11 1976000 3.0639
3.0068 16.18 1984000 3.0775
3.0068 16.24 1992000 3.0669
3.0275 16.31 2000000 3.0627
3.0275 16.37 2008000 3.0645
3.0164 16.44 2016000 3.0667
3.0164 16.5 2024000 3.0490
3.0148 16.57 2032000 3.0618
3.0148 16.63 2040000 3.0545
3.022 16.7 2048000 3.0651
3.022 16.76 2056000 3.0687
3.0235 16.83 2064000 3.0516
3.0235 16.89 2072000 3.0761
3.0194 16.96 2080000 3.0807
3.0194 17.03 2088000 3.0601
3.0142 17.09 2096000 3.0721
3.0142 17.16 2104000 3.0653
3.0183 17.22 2112000 3.0617
3.0183 17.29 2120000 3.0622
3.0092 17.35 2128000 3.0682
3.0092 17.42 2136000 3.0732
3.0071 17.48 2144000 3.0763
3.0071 17.55 2152000 3.0675
3.0272 17.61 2160000 3.0671
3.0272 17.68 2168000 3.0622
3.0235 17.74 2176000 3.0789
3.0235 17.81 2184000 3.0623
3.0179 17.87 2192000 3.0784
3.0179 17.94 2200000 3.0629
3.0209 18.0 2208000 3.0731
3.0209 18.07 2216000 3.0946
3.0237 18.13 2224000 3.0653
3.0237 18.2 2232000 3.0590
3.0164 18.26 2240000 3.0707
3.0164 18.33 2248000 3.0546
3.0206 18.4 2256000 3.0742
3.0206 18.46 2264000 3.0793
3.0138 18.53 2272000 3.0560
3.0138 18.59 2280000 3.0870
3.0377 18.66 2288000 3.0742
3.0377 18.72 2296000 3.0676
3.0227 18.79 2304000 3.0625
3.0227 18.85 2312000 3.0736
3.0359 18.92 2320000 3.0801
3.0359 18.98 2328000 3.0710
3.0248 19.05 2336000 3.0692
3.0248 19.11 2344000 3.0677
3.0235 19.18 2352000 3.0896
3.0235 19.24 2360000 3.0778
3.0187 19.31 2368000 3.0700
3.0187 19.37 2376000 3.0743
3.0189 19.44 2384000 3.0780
3.0189 19.5 2392000 3.0867
3.0184 19.57 2400000 3.0793

Framework versions

  • Transformers 4.35.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.14.0