---
library_name: transformers
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: cwe-parent-vulnerability-classification-roberta-base
  results: []
---

# cwe-parent-vulnerability-classification-roberta-base

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5419
- Accuracy: 0.8068
- F1 Macro: 0.3782

Note that the macro-averaged F1 is far below the accuracy, which suggests performance is concentrated on the most frequent classes and is weaker on rare ones.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 3.2546        | 1.0   | 25   | 3.1526          | 0.0682   | 0.0142   |
| 3.1769        | 2.0   | 50   | 2.9132          | 0.0795   | 0.0185   |
| 3.1105        | 3.0   | 75   | 2.8945          | 0.5      | 0.0752   |
| 3.0319        | 4.0   | 100  | 2.7792          | 0.0114   | 0.0027   |
| 3.0703        | 5.0   | 125  | 2.8108          | 0.0455   | 0.0130   |
| 2.9543        | 6.0   | 150  | 2.8282          | 0.0682   | 0.0235   |
| 2.9445        | 7.0   | 175  | 2.7967          | 0.125    | 0.0743   |
| 2.8502        | 8.0   | 200  | 2.7834          | 0.1818   | 0.0913   |
| 2.7131        | 9.0   | 225  | 2.6824          | 0.4205   | 0.1350   |
| 2.6082        | 10.0  | 250  | 2.6462          | 0.5227   | 0.1447   |
| 2.6341        | 11.0  | 275  | 2.6106          | 0.4545   | 0.1765   |
| 2.3253        | 12.0  | 300  | 2.5876          | 0.5455   | 0.1611   |
| 2.2511        | 13.0  | 325  | 2.6247          | 0.4773   | 0.1580   |
| 2.2056        | 14.0  | 350  | 2.5191          | 0.6136   | 0.1808   |
| 2.0981        | 15.0  | 375  | 2.3928          | 0.625    | 0.2183   |
| 1.8694        | 16.0  | 400  | 2.3963          | 0.6818   | 0.2353   |
| 1.7945        | 17.0  | 425  | 2.2359          | 0.6818   | 0.2290   |
| 1.7152        | 18.0  | 450  | 2.2076          | 0.7159   | 0.2540   |
| 1.6186        | 19.0  | 475  | 2.1035          | 0.7045   | 0.2388   |
| 1.4477        | 20.0  | 500  | 2.0271          | 0.6932   | 0.2464   |
| 1.4064        | 21.0  | 525  | 1.9818          | 0.7159   | 0.2478   |
| 1.2211        | 22.0  | 550  | 1.8832          | 0.7159   | 0.2517   |
| 1.2831        | 23.0  | 575  | 1.8892          | 0.7273   | 0.2712   |
| 1.1426        | 24.0  | 600  | 1.7992          | 0.7273   | 0.2644   |
| 1.054         | 25.0  | 625  | 1.8517          | 0.7386   | 0.2726   |
| 1.0345        | 26.0  | 650  | 1.7283          | 0.7273   | 0.2644   |
| 0.9516        | 27.0  | 675  | 1.7043          | 0.7045   | 0.2630   |
| 0.8861        | 28.0  | 700  | 1.6532          | 0.7386   | 0.2735   |
| 0.8477        | 29.0  | 725  | 1.6508          | 0.7614   | 0.2795   |
| 0.8804        | 30.0  | 750  | 1.6057          | 0.7386   | 0.2612   |
| 0.7854        | 31.0  | 775  | 1.5771          | 0.75     | 0.2902   |
| 0.7311        | 32.0  | 800  | 1.5838          | 0.7614   | 0.2662   |
| 0.7362        | 33.0  | 825  | 1.5649          | 0.7841   | 0.3463   |
| 0.7031        | 34.0  | 850  | 1.5553          | 0.7841   | 0.3361   |
| 0.7589        | 35.0  | 875  | 1.5546          | 0.7955   | 0.3545   |
| 0.6877        | 36.0  | 900  | 1.5557          | 0.7841   | 0.3361   |
| 0.6497        | 37.0  | 925  | 1.5419          | 0.8068   | 0.3782   |
| 0.6565        | 38.0  | 950  | 1.5496          | 0.7955   | 0.3663   |
| 0.6333        | 39.0  | 975  | 1.5531          | 0.7955   | 0.3458   |
| 0.653         | 40.0  | 1000 | 1.5501          | 0.7955   | 0.3458   |

### Framework versions

- Transformers 4.55.4
- Pytorch 2.7.1+cu126
- Datasets 4.0.0
- Tokenizers 0.21.2
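## How to use

The snippet below is a minimal inference sketch. The repo namespace (`your-username` here) is a placeholder you must replace with the actual Hub account this model was pushed under, and the example input text is purely illustrative; the CWE parent-class label mapping depends on the `id2label` stored in the model's config, which is not documented in this card.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Hypothetical repo id: substitute the actual namespace this model lives under.
model_id = "your-username/cwe-parent-vulnerability-classification-roberta-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

# Illustrative vulnerability description (not from the training data).
text = "The application constructs an SQL query using unsanitized user input."
print(classifier(text))
# -> [{'label': ..., 'score': ...}] where 'label' follows the config's id2label mapping
```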
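## Reproducing the training configuration

As a reference, the hyperparameters listed above map onto `TrainingArguments` roughly as follows. This is a sketch only: the dataset, preprocessing, and `compute_metrics` function used for this run are not documented in this card, and `output_dir` is a stand-in.

```python
from transformers import TrainingArguments

# Sketch of the settings reported in "Training hyperparameters".
training_args = TrainingArguments(
    output_dir="cwe-parent-vulnerability-classification-roberta-base",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",          # AdamW defaults: betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=40,
    eval_strategy="epoch",        # the results table reports metrics once per epoch
)
```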