# segformer-finetuned-4ss1st3r_s3gs3m_24Jan_rojo-10k-steps

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the blzncz/4ss1st3r_s3gs3m_24Jan_rojo dataset. It achieves the following results on the evaluation set:
- Loss: 1.2098
- Mean Iou: 0.3648
- Mean Accuracy: 0.6821
- Overall Accuracy: 0.6947
- Accuracy Bg: nan
- Accuracy Fallo cohesivo: 0.7354
- Accuracy Fallo malla: 0.6052
- Accuracy Fallo adhesivo: 0.9884
- Accuracy Fallo burbuja: 0.3995
- Iou Bg: 0.0
- Iou Fallo cohesivo: 0.5920
- Iou Fallo malla: 0.5774
- Iou Fallo adhesivo: 0.2950
- Iou Fallo burbuja: 0.3598
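
The snippet below is a minimal inference sketch for this checkpoint, not an official usage recipe: the Hub repo id and the input image path are assumptions and should be adjusted to the actual paths.

```python
from PIL import Image
import torch
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical Hub repo id for this checkpoint; adjust to the real path.
repo_id = "blzncz/segformer-finetuned-4ss1st3r_s3gs3m_24Jan_rojo-10k-steps"

processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
print(model.config.id2label)            # class-index -> label name mapping
```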
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
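
As a rough sketch, the `TrainingArguments` below reconstruct the settings listed above; the dataset preparation, the `nvidia/mit-b0` model instantiation, and the `Trainer` call are omitted, and the evaluation/save strategies are assumptions inferred from the per-epoch log in the results table below.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; values not listed there are assumptions.
training_args = TrainingArguments(
    output_dir="segformer-finetuned-4ss1st3r_s3gs3m_24Jan_rojo-10k-steps",
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    lr_scheduler_type="polynomial",  # polynomial decay, as listed above
    max_steps=10_000,                # training_steps: 10000
    evaluation_strategy="epoch",     # assumption: the results table logs one eval per epoch
    save_strategy="epoch",           # assumption
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer,
# so no extra optimizer arguments are needed.
```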
### Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Bg | Accuracy Fallo cohesivo | Accuracy Fallo malla | Accuracy Fallo adhesivo | Accuracy Fallo burbuja | Iou Bg | Iou Fallo cohesivo | Iou Fallo malla | Iou Fallo adhesivo | Iou Fallo burbuja |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
0.5778 | 1.0 | 114 | 0.8590 | 0.2588 | 0.5626 | 0.6271 | nan | 0.3415 | 0.9120 | 0.9748 | 0.0221 | 0.0 | 0.3339 | 0.6211 | 0.3168 | 0.0220 |
0.3326 | 2.0 | 228 | 0.6845 | 0.3570 | 0.6755 | 0.7131 | nan | 0.5232 | 0.8953 | 0.9911 | 0.2921 | 0.0 | 0.5003 | 0.6924 | 0.3417 | 0.2503 |
0.2636 | 3.0 | 342 | 0.6662 | 0.3896 | 0.7107 | 0.7344 | nan | 0.7666 | 0.6622 | 0.9838 | 0.4304 | 0.0 | 0.6088 | 0.6188 | 0.4014 | 0.3191 |
0.2505 | 4.0 | 456 | 0.7666 | 0.3732 | 0.7408 | 0.6807 | nan | 0.4141 | 0.9417 | 0.9778 | 0.6297 | 0.0 | 0.4065 | 0.6276 | 0.4337 | 0.3980 |
0.2306 | 5.0 | 570 | 0.4680 | 0.4690 | 0.7389 | 0.8099 | nan | 0.7461 | 0.8649 | 0.9742 | 0.3705 | 0.0 | 0.6711 | 0.7095 | 0.6349 | 0.3294 |
0.1998 | 6.0 | 684 | 0.5711 | 0.4449 | 0.7494 | 0.7824 | nan | 0.8528 | 0.6732 | 0.9865 | 0.4850 | 0.0 | 0.6781 | 0.6338 | 0.5320 | 0.3807 |
0.2062 | 7.0 | 798 | 0.6403 | 0.4070 | 0.7437 | 0.7283 | nan | 0.5736 | 0.8683 | 0.9881 | 0.5447 | 0.0 | 0.5300 | 0.6452 | 0.4613 | 0.3987 |
0.182 | 8.0 | 912 | 0.5934 | 0.4344 | 0.7309 | 0.7770 | nan | 0.8171 | 0.7036 | 0.9840 | 0.4190 | 0.0 | 0.6640 | 0.6485 | 0.4916 | 0.3681 |
0.178 | 9.0 | 1026 | 0.7158 | 0.3811 | 0.6915 | 0.7313 | nan | 0.7292 | 0.6984 | 0.9913 | 0.3472 | 0.0 | 0.6148 | 0.6404 | 0.3348 | 0.3153 |
0.1568 | 10.0 | 1140 | 0.5892 | 0.4169 | 0.6970 | 0.7873 | nan | 0.8088 | 0.7398 | 0.9855 | 0.2538 | 0.0 | 0.6770 | 0.6664 | 0.5004 | 0.2407 |
0.1576 | 11.0 | 1254 | 0.6419 | 0.4228 | 0.7177 | 0.7652 | nan | 0.7970 | 0.7001 | 0.9805 | 0.3931 | 0.0 | 0.6509 | 0.6318 | 0.4701 | 0.3614 |
0.1667 | 12.0 | 1368 | 0.6563 | 0.4060 | 0.7369 | 0.7605 | nan | 0.7409 | 0.7517 | 0.9871 | 0.4681 | 0.0 | 0.6326 | 0.6731 | 0.4103 | 0.3139 |
0.1436 | 13.0 | 1482 | 0.9148 | 0.3864 | 0.7079 | 0.7187 | nan | 0.6666 | 0.7400 | 0.9900 | 0.4352 | 0.0 | 0.6025 | 0.6632 | 0.2829 | 0.3835 |
0.1469 | 14.0 | 1596 | 0.6680 | 0.4166 | 0.7216 | 0.7689 | nan | 0.7843 | 0.7225 | 0.9861 | 0.3936 | 0.0 | 0.6703 | 0.6608 | 0.3946 | 0.3571 |
0.1288 | 15.0 | 1710 | 0.8170 | 0.3765 | 0.6849 | 0.7164 | nan | 0.8269 | 0.5509 | 0.9859 | 0.3759 | 0.0 | 0.6242 | 0.5368 | 0.3815 | 0.3398 |
0.1243 | 16.0 | 1824 | 0.8197 | 0.4034 | 0.7169 | 0.7375 | nan | 0.8456 | 0.5776 | 0.9842 | 0.4602 | 0.0 | 0.6517 | 0.5582 | 0.4078 | 0.3991 |
0.1208 | 17.0 | 1938 | 0.7927 | 0.3848 | 0.6774 | 0.7295 | nan | 0.8592 | 0.5460 | 0.9810 | 0.3233 | 0.0 | 0.6359 | 0.5256 | 0.4647 | 0.2978 |
0.115 | 18.0 | 2052 | 1.1226 | 0.3376 | 0.6484 | 0.6727 | nan | 0.7053 | 0.5900 | 0.9905 | 0.3079 | 0.0 | 0.5659 | 0.5688 | 0.2673 | 0.2860 |
0.1138 | 19.0 | 2166 | 0.8244 | 0.4055 | 0.7099 | 0.7446 | nan | 0.8200 | 0.6248 | 0.9833 | 0.4115 | 0.0 | 0.6364 | 0.5964 | 0.4287 | 0.3659 |
0.1144 | 20.0 | 2280 | 0.5964 | 0.4493 | 0.7179 | 0.8034 | nan | 0.8594 | 0.7188 | 0.9808 | 0.3127 | 0.0 | 0.6995 | 0.6608 | 0.5990 | 0.2873 |
0.108 | 21.0 | 2394 | 0.6545 | 0.4418 | 0.7348 | 0.7902 | nan | 0.8263 | 0.7241 | 0.9835 | 0.4053 | 0.0 | 0.6832 | 0.6653 | 0.5023 | 0.3582 |
0.109 | 22.0 | 2508 | 0.9552 | 0.3775 | 0.6990 | 0.7058 | nan | 0.6835 | 0.6906 | 0.9894 | 0.4325 | 0.0 | 0.5907 | 0.6391 | 0.2756 | 0.3819 |
0.0987 | 23.0 | 2622 | 0.7971 | 0.3974 | 0.7133 | 0.7453 | nan | 0.7451 | 0.7124 | 0.9871 | 0.4084 | 0.0 | 0.6281 | 0.6560 | 0.3577 | 0.3452 |
0.0977 | 24.0 | 2736 | 0.9783 | 0.3718 | 0.6984 | 0.7001 | nan | 0.5950 | 0.7793 | 0.9916 | 0.4276 | 0.0 | 0.5491 | 0.6786 | 0.2620 | 0.3692 |
0.0954 | 25.0 | 2850 | 0.9562 | 0.3856 | 0.6981 | 0.7352 | nan | 0.7102 | 0.7294 | 0.9904 | 0.3623 | 0.0 | 0.6355 | 0.6603 | 0.2988 | 0.3332 |
0.0928 | 26.0 | 2964 | 0.9185 | 0.3787 | 0.6870 | 0.7355 | nan | 0.7815 | 0.6491 | 0.9847 | 0.3327 | 0.0 | 0.6569 | 0.6184 | 0.3151 | 0.3028 |
0.0918 | 27.0 | 3078 | 0.9617 | 0.3845 | 0.6916 | 0.7175 | nan | 0.8211 | 0.5605 | 0.9809 | 0.4037 | 0.0 | 0.6123 | 0.5462 | 0.3994 | 0.3648 |
0.0801 | 28.0 | 3192 | 1.1167 | 0.3672 | 0.6811 | 0.7091 | nan | 0.7352 | 0.6393 | 0.9927 | 0.3570 | 0.0 | 0.6151 | 0.6141 | 0.2816 | 0.3250 |
0.0852 | 29.0 | 3306 | 0.8549 | 0.4217 | 0.7108 | 0.7596 | nan | 0.8684 | 0.6040 | 0.9848 | 0.3862 | 0.0 | 0.6576 | 0.5808 | 0.5146 | 0.3553 |
0.0816 | 30.0 | 3420 | 0.9536 | 0.3902 | 0.7034 | 0.7366 | nan | 0.7752 | 0.6573 | 0.9885 | 0.3926 | 0.0 | 0.6415 | 0.6301 | 0.3274 | 0.3517 |
0.0876 | 31.0 | 3534 | 1.0597 | 0.3873 | 0.7065 | 0.7158 | nan | 0.7490 | 0.6374 | 0.9920 | 0.4475 | 0.0 | 0.6117 | 0.6160 | 0.3051 | 0.4035 |
0.0811 | 32.0 | 3648 | 0.8829 | 0.4038 | 0.7077 | 0.7569 | nan | 0.7949 | 0.6829 | 0.9860 | 0.3669 | 0.0 | 0.6442 | 0.6498 | 0.3943 | 0.3304 |
0.0789 | 33.0 | 3762 | 0.9615 | 0.4002 | 0.7104 | 0.7436 | nan | 0.7890 | 0.6575 | 0.9884 | 0.4066 | 0.0 | 0.6344 | 0.6308 | 0.3702 | 0.3658 |
0.0752 | 34.0 | 3876 | 0.7799 | 0.4297 | 0.7280 | 0.7806 | nan | 0.8279 | 0.6991 | 0.9873 | 0.3975 | 0.0 | 0.6787 | 0.6605 | 0.4458 | 0.3634 |
0.0731 | 35.0 | 3990 | 0.9285 | 0.4061 | 0.7025 | 0.7531 | nan | 0.8595 | 0.5987 | 0.9898 | 0.3619 | 0.0 | 0.6579 | 0.5797 | 0.4600 | 0.3330 |
0.0752 | 36.0 | 4104 | 0.9218 | 0.4112 | 0.7276 | 0.7463 | nan | 0.7632 | 0.6926 | 0.9880 | 0.4667 | 0.0 | 0.6393 | 0.6507 | 0.3462 | 0.4200 |
0.0701 | 37.0 | 4218 | 0.8808 | 0.4105 | 0.7184 | 0.7562 | nan | 0.8090 | 0.6635 | 0.9893 | 0.4119 | 0.0 | 0.6569 | 0.6342 | 0.3876 | 0.3740 |
0.0717 | 38.0 | 4332 | 1.1090 | 0.3748 | 0.6881 | 0.7166 | nan | 0.7554 | 0.6334 | 0.9905 | 0.3729 | 0.0 | 0.6272 | 0.6069 | 0.2969 | 0.3433 |
0.0716 | 39.0 | 4446 | 0.9456 | 0.4018 | 0.7064 | 0.7528 | nan | 0.8217 | 0.6418 | 0.9872 | 0.3747 | 0.0 | 0.6638 | 0.6131 | 0.3863 | 0.3456 |
0.069 | 40.0 | 4560 | 0.8462 | 0.4157 | 0.7038 | 0.7697 | nan | 0.8656 | 0.6316 | 0.9856 | 0.3324 | 0.0 | 0.6750 | 0.6041 | 0.4917 | 0.3078 |
0.07 | 41.0 | 4674 | 0.9715 | 0.3843 | 0.6886 | 0.7393 | nan | 0.8006 | 0.6353 | 0.9896 | 0.3289 | 0.0 | 0.6420 | 0.6104 | 0.3633 | 0.3056 |
0.0649 | 42.0 | 4788 | 0.9114 | 0.3997 | 0.7066 | 0.7592 | nan | 0.7682 | 0.7185 | 0.9917 | 0.3478 | 0.0 | 0.6613 | 0.6728 | 0.3449 | 0.3196 |
0.0665 | 43.0 | 4902 | 1.1847 | 0.3662 | 0.6812 | 0.6981 | nan | 0.7131 | 0.6389 | 0.9912 | 0.3817 | 0.0 | 0.5853 | 0.6122 | 0.2832 | 0.3504 |
0.0646 | 44.0 | 5016 | 1.1242 | 0.3744 | 0.6906 | 0.7086 | nan | 0.6930 | 0.6870 | 0.9891 | 0.3932 | 0.0 | 0.5902 | 0.6495 | 0.2770 | 0.3555 |
0.0662 | 45.0 | 5130 | 1.1017 | 0.3605 | 0.6735 | 0.7023 | nan | 0.7333 | 0.6261 | 0.9906 | 0.3439 | 0.0 | 0.5997 | 0.5996 | 0.2906 | 0.3126 |
0.0644 | 46.0 | 5244 | 1.2989 | 0.3470 | 0.6600 | 0.6735 | nan | 0.6567 | 0.6473 | 0.9917 | 0.3445 | 0.0 | 0.5607 | 0.6182 | 0.2377 | 0.3185 |
0.0595 | 47.0 | 5358 | 1.0764 | 0.3833 | 0.6982 | 0.7241 | nan | 0.7650 | 0.6389 | 0.9932 | 0.3957 | 0.0 | 0.6345 | 0.6134 | 0.3071 | 0.3618 |
0.0603 | 48.0 | 5472 | 1.0871 | 0.3692 | 0.6813 | 0.7128 | nan | 0.7153 | 0.6718 | 0.9884 | 0.3497 | 0.0 | 0.6079 | 0.6388 | 0.2797 | 0.3197 |
0.0591 | 49.0 | 5586 | 1.1054 | 0.3800 | 0.6956 | 0.7171 | nan | 0.7103 | 0.6866 | 0.9892 | 0.3963 | 0.0 | 0.6116 | 0.6458 | 0.2816 | 0.3609 |
0.0612 | 50.0 | 5700 | 1.1061 | 0.3652 | 0.6768 | 0.7087 | nan | 0.7394 | 0.6340 | 0.9903 | 0.3435 | 0.0 | 0.6027 | 0.6074 | 0.3009 | 0.3147 |
0.0609 | 51.0 | 5814 | 0.9938 | 0.3742 | 0.6850 | 0.7206 | nan | 0.7555 | 0.6433 | 0.9890 | 0.3523 | 0.0 | 0.6210 | 0.6121 | 0.3143 | 0.3235 |
0.058 | 52.0 | 5928 | 1.0391 | 0.3745 | 0.6836 | 0.7248 | nan | 0.7691 | 0.6374 | 0.9901 | 0.3379 | 0.0 | 0.6275 | 0.6082 | 0.3257 | 0.3109 |
0.0559 | 53.0 | 6042 | 0.9916 | 0.3922 | 0.7033 | 0.7373 | nan | 0.8044 | 0.6249 | 0.9902 | 0.3938 | 0.0 | 0.6429 | 0.6003 | 0.3644 | 0.3537 |
0.0572 | 54.0 | 6156 | 1.0124 | 0.3801 | 0.6907 | 0.7262 | nan | 0.7721 | 0.6371 | 0.9885 | 0.3650 | 0.0 | 0.6284 | 0.6052 | 0.3326 | 0.3341 |
0.0558 | 55.0 | 6270 | 1.0856 | 0.3692 | 0.6823 | 0.7120 | nan | 0.7232 | 0.6604 | 0.9894 | 0.3565 | 0.0 | 0.6094 | 0.6255 | 0.2864 | 0.3246 |
0.058 | 56.0 | 6384 | 1.0581 | 0.3837 | 0.6998 | 0.7212 | nan | 0.7353 | 0.6668 | 0.9910 | 0.4062 | 0.0 | 0.6126 | 0.6269 | 0.3125 | 0.3666 |
0.0518 | 57.0 | 6498 | 1.0176 | 0.3933 | 0.7060 | 0.7362 | nan | 0.7857 | 0.6440 | 0.9884 | 0.4060 | 0.0 | 0.6395 | 0.6127 | 0.3489 | 0.3655 |
0.0537 | 58.0 | 6612 | 1.2001 | 0.3676 | 0.6853 | 0.6947 | nan | 0.7737 | 0.5607 | 0.9884 | 0.4184 | 0.0 | 0.6003 | 0.5391 | 0.3221 | 0.3764 |
0.0552 | 59.0 | 6726 | 0.9751 | 0.3940 | 0.7068 | 0.7314 | nan | 0.8019 | 0.6139 | 0.9870 | 0.4244 | 0.0 | 0.6353 | 0.5871 | 0.3662 | 0.3816 |
0.0538 | 60.0 | 6840 | 1.0382 | 0.3782 | 0.6909 | 0.7203 | nan | 0.7216 | 0.6813 | 0.9895 | 0.3714 | 0.0 | 0.6093 | 0.6389 | 0.3011 | 0.3418 |
0.0528 | 61.0 | 6954 | 1.1785 | 0.3662 | 0.6819 | 0.7019 | nan | 0.7278 | 0.6310 | 0.9904 | 0.3784 | 0.0 | 0.5966 | 0.6013 | 0.2914 | 0.3419 |
0.0531 | 62.0 | 7068 | 1.1054 | 0.3685 | 0.6852 | 0.7026 | nan | 0.7290 | 0.6310 | 0.9899 | 0.3911 | 0.0 | 0.5969 | 0.5981 | 0.2961 | 0.3514 |
0.0522 | 63.0 | 7182 | 1.1271 | 0.3717 | 0.6871 | 0.7094 | nan | 0.7268 | 0.6496 | 0.9906 | 0.3816 | 0.0 | 0.6069 | 0.6148 | 0.2905 | 0.3460 |
0.0507 | 64.0 | 7296 | 1.0440 | 0.3734 | 0.6825 | 0.7242 | nan | 0.7678 | 0.6380 | 0.9884 | 0.3359 | 0.0 | 0.6279 | 0.6043 | 0.3272 | 0.3076 |
0.0519 | 65.0 | 7410 | 1.1191 | 0.3727 | 0.6884 | 0.7102 | nan | 0.7264 | 0.6517 | 0.9911 | 0.3843 | 0.0 | 0.6028 | 0.6156 | 0.2978 | 0.3472 |
0.0502 | 66.0 | 7524 | 1.0089 | 0.3917 | 0.7036 | 0.7408 | nan | 0.7555 | 0.6898 | 0.9896 | 0.3794 | 0.0 | 0.6413 | 0.6472 | 0.3261 | 0.3437 |
0.051 | 67.0 | 7638 | 1.2112 | 0.3672 | 0.6806 | 0.7083 | nan | 0.7352 | 0.6378 | 0.9899 | 0.3593 | 0.0 | 0.6078 | 0.6085 | 0.2918 | 0.3279 |
0.0508 | 68.0 | 7752 | 1.1584 | 0.3702 | 0.6860 | 0.7052 | nan | 0.7202 | 0.6477 | 0.9888 | 0.3875 | 0.0 | 0.5956 | 0.6155 | 0.2902 | 0.3495 |
0.048 | 69.0 | 7866 | 1.1363 | 0.3773 | 0.6922 | 0.7165 | nan | 0.7297 | 0.6628 | 0.9895 | 0.3865 | 0.0 | 0.6158 | 0.6289 | 0.2901 | 0.3518 |
0.0483 | 70.0 | 7980 | 1.1489 | 0.3749 | 0.6916 | 0.7103 | nan | 0.7398 | 0.6367 | 0.9889 | 0.4011 | 0.0 | 0.6074 | 0.6080 | 0.2994 | 0.3598 |
0.0495 | 71.0 | 8094 | 1.1470 | 0.3774 | 0.6943 | 0.7102 | nan | 0.7454 | 0.6295 | 0.9891 | 0.4131 | 0.0 | 0.6059 | 0.6032 | 0.3053 | 0.3724 |
0.0472 | 72.0 | 8208 | 1.2749 | 0.3597 | 0.6782 | 0.6864 | nan | 0.7291 | 0.5930 | 0.9891 | 0.4017 | 0.0 | 0.5899 | 0.5704 | 0.2771 | 0.3612 |
0.0486 | 73.0 | 8322 | 1.1217 | 0.3773 | 0.6946 | 0.7117 | nan | 0.7549 | 0.6224 | 0.9882 | 0.4128 | 0.0 | 0.6094 | 0.5946 | 0.3150 | 0.3678 |
0.051 | 74.0 | 8436 | 1.1895 | 0.3724 | 0.6889 | 0.7069 | nan | 0.7432 | 0.6247 | 0.9888 | 0.3990 | 0.0 | 0.6052 | 0.5959 | 0.3021 | 0.3590 |
0.0472 | 75.0 | 8550 | 1.2084 | 0.3677 | 0.6847 | 0.7009 | nan | 0.7179 | 0.6399 | 0.9905 | 0.3904 | 0.0 | 0.5979 | 0.6078 | 0.2808 | 0.3522 |
0.0481 | 76.0 | 8664 | 1.1778 | 0.3688 | 0.6841 | 0.7049 | nan | 0.7395 | 0.6244 | 0.9899 | 0.3824 | 0.0 | 0.6024 | 0.5950 | 0.2996 | 0.3469 |
0.0462 | 77.0 | 8778 | 1.2409 | 0.3693 | 0.6863 | 0.7015 | nan | 0.7278 | 0.6297 | 0.9900 | 0.3975 | 0.0 | 0.5964 | 0.5990 | 0.2918 | 0.3593 |
0.0464 | 78.0 | 8892 | 1.2724 | 0.3606 | 0.6792 | 0.6877 | nan | 0.7119 | 0.6158 | 0.9905 | 0.3986 | 0.0 | 0.5825 | 0.5857 | 0.2770 | 0.3578 |
0.0477 | 79.0 | 9006 | 1.2107 | 0.3629 | 0.6797 | 0.6936 | nan | 0.7322 | 0.6063 | 0.9898 | 0.3905 | 0.0 | 0.5928 | 0.5791 | 0.2889 | 0.3540 |
0.0452 | 80.0 | 9120 | 1.1745 | 0.3721 | 0.6889 | 0.7059 | nan | 0.7548 | 0.6087 | 0.9899 | 0.4022 | 0.0 | 0.6080 | 0.5820 | 0.3085 | 0.3620 |
0.0447 | 81.0 | 9234 | 1.2787 | 0.3599 | 0.6776 | 0.6876 | nan | 0.7199 | 0.6063 | 0.9902 | 0.3938 | 0.0 | 0.5857 | 0.5788 | 0.2786 | 0.3566 |
0.0481 | 82.0 | 9348 | 1.2049 | 0.3658 | 0.6836 | 0.6947 | nan | 0.7515 | 0.5865 | 0.9887 | 0.4078 | 0.0 | 0.5956 | 0.5627 | 0.3044 | 0.3660 |
0.0444 | 83.0 | 9462 | 1.1427 | 0.3746 | 0.6930 | 0.7051 | nan | 0.7520 | 0.6100 | 0.9883 | 0.4215 | 0.0 | 0.6042 | 0.5824 | 0.3100 | 0.3763 |
0.0481 | 84.0 | 9576 | 1.1876 | 0.3669 | 0.6848 | 0.6968 | nan | 0.7358 | 0.6094 | 0.9895 | 0.4046 | 0.0 | 0.5944 | 0.5818 | 0.2947 | 0.3636 |
0.046 | 85.0 | 9690 | 1.2264 | 0.3628 | 0.6799 | 0.6928 | nan | 0.7348 | 0.6015 | 0.9885 | 0.3948 | 0.0 | 0.5906 | 0.5746 | 0.2930 | 0.3560 |
0.0472 | 86.0 | 9804 | 1.2377 | 0.3659 | 0.6828 | 0.6967 | nan | 0.7287 | 0.6176 | 0.9890 | 0.3959 | 0.0 | 0.5926 | 0.5876 | 0.2913 | 0.3577 |
0.0465 | 87.0 | 9918 | 1.2037 | 0.3644 | 0.6841 | 0.6903 | nan | 0.7176 | 0.6150 | 0.9893 | 0.4146 | 0.0 | 0.5859 | 0.5856 | 0.2808 | 0.3697 |
0.0475 | 87.72 | 10000 | 1.2098 | 0.3648 | 0.6821 | 0.6947 | nan | 0.7354 | 0.6052 | 0.9884 | 0.3995 | 0.0 | 0.5920 | 0.5774 | 0.2950 | 0.3598 |
### Framework versions

- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.13.1
- Tokenizers 0.13.3