classifier-de1
This model is a fine-tuned version of distilbert-base-german-cased on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3485
- Accuracy: 0.8738
- Precision: 0.4859
- Recall: 0.3069
- F1: 0.3762
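Assuming this is a sequence (text) classification model, it can be loaded with the standard transformers pipeline. The sketch below is illustrative only: the repository id egerber1/classifier-de1 comes from the model tree at the end of this card, while the example sentence and the printed label/score are placeholders (the label names depend on the undocumented training dataset).

```python
from transformers import pipeline

# Hypothetical usage sketch; repository id taken from the model tree below.
classifier = pipeline("text-classification", model="egerber1/classifier-de1")

# The input sentence is a placeholder; label names and scores are illustrative.
print(classifier("Das ist ein Beispielsatz auf Deutsch."))
# e.g. [{'label': 'LABEL_1', 'score': 0.83}]
```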
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1.5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
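As a hedged illustration, the values above map roughly onto the following TrainingArguments setup. This is a reconstruction under assumptions, not the actual training script: the dataset, tokenization, number of labels, and output directory are not documented in this card, and only the listed hyperparameter values (plus the 500-step evaluation interval visible in the results table) come from the actual run.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "distilbert/distilbert-base-german-cased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=2 is an assumption; the actual label set is not documented.
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)

training_args = TrainingArguments(
    output_dir="classifier-de1",        # assumed output directory
    learning_rate=1.5e-5,
    per_device_train_batch_size=256,    # assuming a single device
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=6,
    eval_strategy="steps",              # evaluation every 500 steps, as in the table below
    eval_steps=500,
)

# The training/evaluation datasets are not documented, so the Trainer call
# is left as a commented placeholder:
# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=...,   # unknown dataset
#     eval_dataset=...,
#     processing_class=tokenizer,
# )
# trainer.train()
```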
Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
---|---|---|---|---|---|---|---|
0.3406 | 0.0513 | 500 | 0.3753 | 0.8760 | 0.0 | 0.0 | 0.0 |
0.3251 | 0.1025 | 1000 | 0.3678 | 0.8760 | 0.0 | 0.0 | 0.0 |
0.2989 | 0.1538 | 1500 | 0.3666 | 0.8756 | 0.2806 | 0.0021 | 0.0042 |
0.2989 | 0.2050 | 2000 | 0.3648 | 0.8734 | 0.4034 | 0.0430 | 0.0776 |
0.2922 | 0.2563 | 2500 | 0.3626 | 0.8746 | 0.4528 | 0.0545 | 0.0973 |
0.2757 | 0.3075 | 3000 | 0.3647 | 0.8690 | 0.3960 | 0.1072 | 0.1687 |
0.29 | 0.3588 | 3500 | 0.3584 | 0.8706 | 0.4192 | 0.1139 | 0.1791 |
0.2587 | 0.4100 | 4000 | 0.3690 | 0.8707 | 0.4287 | 0.1275 | 0.1965 |
0.2654 | 0.4613 | 4500 | 0.3626 | 0.8705 | 0.4310 | 0.1387 | 0.2098 |
0.2658 | 0.5125 | 5000 | 0.3585 | 0.8758 | 0.4958 | 0.1114 | 0.1820 |
0.2523 | 0.5638 | 5500 | 0.3527 | 0.8725 | 0.4556 | 0.1445 | 0.2194 |
0.2621 | 0.6150 | 6000 | 0.3522 | 0.8750 | 0.4855 | 0.1308 | 0.2061 |
0.2501 | 0.6663 | 6500 | 0.3556 | 0.8594 | 0.3934 | 0.2469 | 0.3034 |
0.2318 | 0.7175 | 7000 | 0.3536 | 0.8771 | 0.5181 | 0.1297 | 0.2075 |
0.2362 | 0.7688 | 7500 | 0.3424 | 0.8776 | 0.5279 | 0.1201 | 0.1956 |
0.2351 | 0.8200 | 8000 | 0.3354 | 0.8731 | 0.4723 | 0.2014 | 0.2823 |
0.2153 | 0.8713 | 8500 | 0.3426 | 0.8775 | 0.5198 | 0.1573 | 0.2416 |
0.215 | 0.9225 | 9000 | 0.3384 | 0.8785 | 0.5416 | 0.1323 | 0.2127 |
0.2177 | 0.9738 | 9500 | 0.3353 | 0.8749 | 0.4891 | 0.2040 | 0.2879 |
0.2173 | 1.0250 | 10000 | 0.3303 | 0.8729 | 0.4737 | 0.2243 | 0.3044 |
0.2128 | 1.0763 | 10500 | 0.3363 | 0.8770 | 0.5125 | 0.1677 | 0.2527 |
0.2093 | 1.1275 | 11000 | 0.3354 | 0.8720 | 0.4693 | 0.2471 | 0.3238 |
0.2022 | 1.1788 | 11500 | 0.3349 | 0.8752 | 0.4929 | 0.2122 | 0.2967 |
0.1978 | 1.2300 | 12000 | 0.3382 | 0.8722 | 0.4700 | 0.2421 | 0.3196 |
0.1974 | 1.2813 | 12500 | 0.3265 | 0.8753 | 0.4930 | 0.1923 | 0.2767 |
0.2185 | 1.3325 | 13000 | 0.3458 | 0.8755 | 0.4951 | 0.2055 | 0.2904 |
0.1973 | 1.3838 | 13500 | 0.3472 | 0.8738 | 0.4824 | 0.2482 | 0.3278 |
0.1946 | 1.4350 | 14000 | 0.3367 | 0.8779 | 0.5203 | 0.1915 | 0.2799 |
0.1986 | 1.4863 | 14500 | 0.3394 | 0.8717 | 0.4704 | 0.2750 | 0.3471 |
0.1922 | 1.5375 | 15000 | 0.3310 | 0.8770 | 0.5090 | 0.2321 | 0.3188 |
0.1765 | 1.5888 | 15500 | 0.3584 | 0.8797 | 0.5454 | 0.1779 | 0.2682 |
0.2039 | 1.6400 | 16000 | 0.3279 | 0.8774 | 0.5128 | 0.2290 | 0.3166 |
0.2051 | 1.6913 | 16500 | 0.3302 | 0.8794 | 0.5376 | 0.1970 | 0.2883 |
0.1868 | 1.7425 | 17000 | 0.3222 | 0.8763 | 0.5021 | 0.2498 | 0.3336 |
0.1972 | 1.7938 | 17500 | 0.3296 | 0.8685 | 0.4564 | 0.3163 | 0.3737 |
0.1932 | 1.8450 | 18000 | 0.3185 | 0.8776 | 0.5136 | 0.2399 | 0.3270 |
0.1797 | 1.8963 | 18500 | 0.3231 | 0.8768 | 0.5064 | 0.2446 | 0.3298 |
0.1835 | 1.9475 | 19000 | 0.3230 | 0.8748 | 0.4913 | 0.2729 | 0.3509 |
0.1767 | 1.9988 | 19500 | 0.3286 | 0.8756 | 0.4970 | 0.2566 | 0.3385 |
0.192 | 2.0500 | 20000 | 0.3304 | 0.8781 | 0.5183 | 0.2405 | 0.3285 |
0.1795 | 2.1013 | 20500 | 0.3333 | 0.8793 | 0.5326 | 0.2145 | 0.3059 |
0.1716 | 2.1525 | 21000 | 0.3499 | 0.8760 | 0.4998 | 0.2685 | 0.3493 |
0.177 | 2.2038 | 21500 | 0.3329 | 0.8775 | 0.5127 | 0.2395 | 0.3265 |
0.1541 | 2.2550 | 22000 | 0.3323 | 0.8781 | 0.5182 | 0.2444 | 0.3321 |
0.1725 | 2.3063 | 22500 | 0.3384 | 0.8799 | 0.5423 | 0.2033 | 0.2958 |
0.182 | 2.3575 | 23000 | 0.3326 | 0.8777 | 0.5138 | 0.2551 | 0.3409 |
0.1575 | 2.4088 | 23500 | 0.3373 | 0.8781 | 0.5188 | 0.2381 | 0.3264 |
0.1735 | 2.4600 | 24000 | 0.3436 | 0.8795 | 0.5331 | 0.2280 | 0.3194 |
0.1545 | 2.5113 | 24500 | 0.3400 | 0.8804 | 0.5447 | 0.2180 | 0.3114 |
0.1592 | 2.5625 | 25000 | 0.3422 | 0.8790 | 0.5272 | 0.2348 | 0.3249 |
0.1395 | 2.6138 | 25500 | 0.3583 | 0.8796 | 0.5358 | 0.2177 | 0.3096 |
0.1543 | 2.6650 | 26000 | 0.3341 | 0.8791 | 0.5296 | 0.2257 | 0.3165 |
0.1811 | 2.7163 | 26500 | 0.3245 | 0.8764 | 0.5032 | 0.2790 | 0.3589 |
0.1564 | 2.7675 | 27000 | 0.3395 | 0.8789 | 0.5246 | 0.2485 | 0.3373 |
0.1585 | 2.8188 | 27500 | 0.3465 | 0.8787 | 0.5221 | 0.2571 | 0.3445 |
0.1642 | 2.8700 | 28000 | 0.3545 | 0.8811 | 0.5508 | 0.2230 | 0.3174 |
0.1633 | 2.9213 | 28500 | 0.3339 | 0.8755 | 0.4963 | 0.2942 | 0.3694 |
0.1663 | 2.9725 | 29000 | 0.3398 | 0.8781 | 0.5166 | 0.2682 | 0.3531 |
0.136 | 3.0238 | 29500 | 0.3607 | 0.8807 | 0.5466 | 0.2240 | 0.3178 |
0.1409 | 3.0750 | 30000 | 0.3660 | 0.8793 | 0.5304 | 0.2336 | 0.3244 |
0.1474 | 3.1263 | 30500 | 0.3519 | 0.8763 | 0.5026 | 0.2635 | 0.3457 |
0.1505 | 3.1775 | 31000 | 0.3485 | 0.8738 | 0.4859 | 0.3069 | 0.3762 |
0.133 | 3.2288 | 31500 | 0.3578 | 0.8797 | 0.5357 | 0.2263 | 0.3182 |
0.1438 | 3.2800 | 32000 | 0.3455 | 0.8758 | 0.4985 | 0.2839 | 0.3617 |
0.1591 | 3.3313 | 32500 | 0.3373 | 0.8749 | 0.4929 | 0.3033 | 0.3755 |
0.1738 | 3.3825 | 33000 | 0.3446 | 0.8781 | 0.5169 | 0.2656 | 0.3509 |
0.1683 | 3.4338 | 33500 | 0.3380 | 0.8776 | 0.5123 | 0.2721 | 0.3554 |
0.1567 | 3.4850 | 34000 | 0.3493 | 0.8799 | 0.5338 | 0.2481 | 0.3387 |
0.1388 | 3.5363 | 34500 | 0.3463 | 0.8791 | 0.5255 | 0.2557 | 0.3440 |
0.15 | 3.5875 | 35000 | 0.3391 | 0.8811 | 0.5454 | 0.2465 | 0.3396 |
0.1478 | 3.6388 | 35500 | 0.3465 | 0.8799 | 0.5327 | 0.2544 | 0.3444 |
0.1359 | 3.6900 | 36000 | 0.3705 | 0.8798 | 0.5321 | 0.2515 | 0.3416 |
0.1502 | 3.7413 | 36500 | 0.3386 | 0.8790 | 0.5236 | 0.2653 | 0.3522 |
0.1387 | 3.7925 | 37000 | 0.3514 | 0.8789 | 0.5227 | 0.2719 | 0.3577 |
0.1484 | 3.8438 | 37500 | 0.3391 | 0.8805 | 0.5432 | 0.2283 | 0.3215 |
0.154 | 3.8950 | 38000 | 0.3584 | 0.8807 | 0.5456 | 0.2259 | 0.3195 |
0.1395 | 3.9463 | 38500 | 0.3403 | 0.8779 | 0.5137 | 0.2804 | 0.3628 |
0.1429 | 3.9975 | 39000 | 0.3467 | 0.8783 | 0.5172 | 0.2747 | 0.3588 |
0.1278 | 4.0488 | 39500 | 0.3581 | 0.8793 | 0.5272 | 0.2609 | 0.3491 |
0.1582 | 4.1000 | 40000 | 0.3483 | 0.8783 | 0.5179 | 0.2719 | 0.3566 |
0.1174 | 4.1513 | 40500 | 0.3587 | 0.8794 | 0.5279 | 0.2604 | 0.3487 |
0.1363 | 4.2025 | 41000 | 0.3594 | 0.8800 | 0.5347 | 0.2514 | 0.3420 |
0.1361 | 4.2538 | 41500 | 0.3664 | 0.8806 | 0.5414 | 0.2426 | 0.3350 |
0.1299 | 4.3050 | 42000 | 0.3603 | 0.8792 | 0.5258 | 0.2606 | 0.3485 |
0.1443 | 4.3563 | 42500 | 0.3705 | 0.8796 | 0.5296 | 0.2616 | 0.3502 |
0.1417 | 4.4075 | 43000 | 0.3611 | 0.8800 | 0.5350 | 0.2455 | 0.3366 |
0.1354 | 4.4588 | 43500 | 0.3523 | 0.8792 | 0.5249 | 0.2735 | 0.3596 |
0.1474 | 4.5100 | 44000 | 0.3683 | 0.8812 | 0.5481 | 0.2384 | 0.3323 |
0.1398 | 4.5613 | 44500 | 0.3537 | 0.8800 | 0.5328 | 0.2599 | 0.3494 |
0.1558 | 4.6125 | 45000 | 0.3529 | 0.8804 | 0.5391 | 0.2466 | 0.3384 |
0.1479 | 4.6638 | 45500 | 0.3489 | 0.8794 | 0.5270 | 0.2640 | 0.3518 |
0.1454 | 4.7150 | 46000 | 0.3618 | 0.8798 | 0.5309 | 0.2620 | 0.3508 |
0.1327 | 4.7663 | 46500 | 0.3634 | 0.8807 | 0.5423 | 0.2444 | 0.3369 |
0.1427 | 4.8175 | 47000 | 0.3578 | 0.8784 | 0.5175 | 0.2836 | 0.3664 |
0.1361 | 4.8688 | 47500 | 0.3531 | 0.8794 | 0.5272 | 0.2693 | 0.3565 |
0.1303 | 4.9200 | 48000 | 0.3636 | 0.8789 | 0.5231 | 0.2627 | 0.3498 |
0.1373 | 4.9713 | 48500 | 0.3528 | 0.8791 | 0.5252 | 0.2628 | 0.3503 |
0.1339 | 5.0226 | 49000 | 0.3662 | 0.8795 | 0.5286 | 0.2631 | 0.3513 |
0.1449 | 5.0738 | 49500 | 0.3603 | 0.8773 | 0.5095 | 0.2778 | 0.3596 |
0.1295 | 5.1251 | 50000 | 0.3811 | 0.8795 | 0.5284 | 0.2616 | 0.3499 |
0.1372 | 5.1763 | 50500 | 0.3637 | 0.8769 | 0.5065 | 0.2885 | 0.3676 |
0.1381 | 5.2276 | 51000 | 0.3629 | 0.8784 | 0.5176 | 0.2833 | 0.3662 |
0.1334 | 5.2788 | 51500 | 0.3639 | 0.8788 | 0.5219 | 0.2672 | 0.3535 |
0.1422 | 5.3301 | 52000 | 0.3694 | 0.8779 | 0.5147 | 0.2729 | 0.3566 |
0.1413 | 5.3813 | 52500 | 0.3610 | 0.8773 | 0.5097 | 0.2822 | 0.3633 |
0.1487 | 5.4326 | 53000 | 0.3650 | 0.8778 | 0.5136 | 0.2736 | 0.3570 |
0.1431 | 5.4838 | 53500 | 0.3704 | 0.8797 | 0.5309 | 0.2567 | 0.3461 |
0.142 | 5.5351 | 54000 | 0.3637 | 0.8794 | 0.5278 | 0.2607 | 0.3490 |
0.1406 | 5.5863 | 54500 | 0.3670 | 0.8790 | 0.5243 | 0.2641 | 0.3512 |
0.1484 | 5.6376 | 55000 | 0.3608 | 0.8775 | 0.5109 | 0.2793 | 0.3612 |
0.1433 | 5.6888 | 55500 | 0.3652 | 0.8787 | 0.5211 | 0.2705 | 0.3562 |
0.1219 | 5.7401 | 56000 | 0.3655 | 0.8782 | 0.5165 | 0.2759 | 0.3597 |
0.1344 | 5.7913 | 56500 | 0.3662 | 0.8790 | 0.5242 | 0.2649 | 0.3519 |
0.1598 | 5.8426 | 57000 | 0.3684 | 0.8787 | 0.5208 | 0.2727 | 0.3580 |
0.1287 | 5.8938 | 57500 | 0.3659 | 0.8791 | 0.5240 | 0.2692 | 0.3556 |
0.1182 | 5.9451 | 58000 | 0.3671 | 0.8793 | 0.5263 | 0.2657 | 0.3531 |
0.1242 | 5.9963 | 58500 | 0.3650 | 0.8790 | 0.5234 | 0.2693 | 0.3556 |
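The precision, recall, and F1 columns look like binary metrics for a single positive class: in the earliest evaluations precision and recall are 0.0 while accuracy stays near 0.876, which is what happens when the model predicts only the majority class. A minimal compute_metrics sketch that would produce this kind of output is shown below; the use of scikit-learn, binary averaging, and zero_division=0 are assumptions, not details taken from the training script.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Assumed metric computation: accuracy plus binary precision/recall/F1."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="binary", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```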
Framework versions
- Transformers 4.51.3
- PyTorch 2.7.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1
Model tree for egerber1/classifier-de1
- Base model: distilbert/distilbert-base-german-cased