---
library_name: transformers
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: cwe-parent-vulnerability-classification-roberta-base-roberta-base
  results: []
---

# cwe-parent-vulnerability-classification-roberta-base-roberta-base

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.8770
- Accuracy: 0.3704
- F1 Macro: 0.2104

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9,0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- num_epochs: 40

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 3.2548        | 1.0   | 22   | 3.2008          | 0.0247   | 0.0032   |
| 3.2101        | 2.0   | 44   | 3.1368          | 0.2469   | 0.0390   |
| 3.1235        | 3.0   | 66   | 3.1592          | 0.3086   | 0.0470   |
| 3.1517        | 4.0   | 88   | 3.1942          | 0.0741   | 0.0306   |
| 3.1203        | 5.0   | 110  | 3.1893          | 0.0741   | 0.0236   |
| 3.052         | 6.0   | 132  | 3.2068          | 0.1111   | 0.0506   |
| 2.9901        | 7.0   | 154  | 3.2085          | 0.0864   | 0.0450   |
| 2.9408        | 8.0   | 176  | 3.1076          | 0.1605   | 0.0837   |
| 2.9616        | 9.0   | 198  | 3.1395          | 0.2840   | 0.1093   |
| 2.6981        | 10.0  | 220  | 3.0276          | 0.1235   | 0.0822   |
| 2.5881        | 11.0  | 242  | 2.9858          | 0.3086   | 0.1426   |
| 2.4502        | 12.0  | 264  | 3.0535          | 0.2963   | 0.1760   |
| 2.3384        | 13.0  | 286  | 2.9500          | 0.2840   | 0.1541   |
| 2.3099        | 14.0  | 308  | 2.9306          | 0.2593   | 0.1812   |
| 2.1734        | 15.0  | 330  | 2.9583          | 0.3086   | 0.1412   |
| 2.0758        | 16.0  | 352  | 2.9464          | 0.2840   | 0.1504   |
| 1.9912        | 17.0  | 374  | 2.9119          | 0.3210   | 0.1949   |
| 1.8726        | 18.0  | 396  | 2.9168          | 0.3210   | 0.1794   |
| 1.8145        | 19.0  | 418  | 2.9360          | 0.2963   | 0.1724   |
| 1.6758        | 20.0  | 440  | 2.9125          | 0.3333   | 0.1914   |
| 1.5863        | 21.0  | 462  | 2.9420          | 0.3457   | 0.2171   |
| 1.5365        | 22.0  | 484  | 2.9001          | 0.3580   | 0.2316   |
| 1.4698        | 23.0  | 506  | 2.8783          | 0.3457   | 0.2107   |
| 1.4471        | 24.0  | 528  | 2.9298          | 0.3580   | 0.2286   |
| 1.3445        | 25.0  | 550  | 2.8971          | 0.3580   | 0.2178   |
| 1.3723        | 26.0  | 572  | 2.8770          | 0.3704   | 0.2104   |
| 1.1981        | 27.0  | 594  | 2.9112          | 0.3704   | 0.2195   |
| 1.279         | 28.0  | 616  | 2.9038          | 0.3580   | 0.2278   |
| 1.1505        | 29.0  | 638  | 2.9192          | 0.3704   | 0.2269   |
| 1.1089        | 30.0  | 660  | 2.9398          | 0.3704   | 0.2228   |
| 1.0631        | 31.0  | 682  | 2.9589          | 0.3704   | 0.2292   |
| 1.0373        | 32.0  | 704  | 2.9136          | 0.3704   | 0.2106   |
| 0.9814        | 33.0  | 726  | 2.9551          | 0.3457   | 0.2155   |
| 1.0372        | 34.0  | 748  | 2.9457          | 0.3704   | 0.2094   |
| 0.9644        | 35.0  | 770  | 2.9645          | 0.3827   | 0.2269   |
| 1.0171        | 36.0  | 792  | 2.9565          | 0.3704   | 0.2317   |
| 0.9021        | 37.0  | 814  | 2.9583          | 0.3951   | 0.2400   |
| 0.9202        | 38.0  | 836  | 2.9742          | 0.4074   | 0.2458   |
| 0.9314        | 39.0  | 858  | 2.9691          | 0.3951   | 0.2349   |
| 0.9293        | 40.0  | 880  | 2.9746          | 0.3951   | 0.2349   |

### Framework versions

- Transformers 4.56.1
- Pytorch 2.8.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.0
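
## How to use

A minimal inference sketch, assuming this checkpoint is a standard sequence-classification head on top of roberta-base (as the base model and metrics suggest). The Hub model id and the example input below are assumptions for illustration, not confirmed by this card.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical Hub id; replace with the actual repo path for this checkpoint.
model_id = "cwe-parent-vulnerability-classification-roberta-base-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Example input; the model name suggests the inputs are vulnerability
# descriptions to be classified into parent CWE categories (an assumption).
text = "A buffer overflow in the parsing routine allows remote code execution."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to a label via the model config.
pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])
```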
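
## Reproducing the training arguments

A sketch of `TrainingArguments` mirroring the hyperparameters listed above, assuming the standard `Trainer` workflow was used; dataset loading, tokenization, and the `compute_metrics` function are omitted and would need to be supplied.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cwe-parent-vulnerability-classification-roberta-base-roberta-base",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch_fused",  # betas=(0.9, 0.999) and epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=40,
    eval_strategy="epoch",      # the card reports validation metrics once per epoch
)
```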