# a3470200b5880cada2b4db84cc65881b
This model is a fine-tuned version of distilbert/distilbert-base-cased on the dim/tldr_news dataset. It achieves the following results on the evaluation set:
- Loss: 1.2736
- Data Size: 1.0
- Epoch Runtime: 6.3421
- Accuracy: 0.7607
- F1 Macro: 0.8046
- Rouge1: 0.7610
- Rouge2: 0.0
- Rougel: 0.7607
- Rougelsum: 0.7607
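
The accuracy and macro-F1 numbers indicate a sequence-classification head. Below is a minimal inference sketch, assuming the checkpoint is published under the repo id `contemmcm/a3470200b5880cada2b4db84cc65881b` shown on this page and follows the standard Transformers classification API:

```python
# Minimal inference sketch (assumption: this checkpoint is a
# sequence-classification fine-tune of distilbert-base-cased).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "contemmcm/a3470200b5880cada2b4db84cc65881b"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

headline = "Nvidia unveils its next-generation GPU architecture"  # example input
inputs = tokenizer(headline, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to its label name.
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```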
 
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
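
For reference, here is a hedged sketch of a `TrainingArguments` configuration that mirrors these values; the output directory is hypothetical, and the per-device batch size of 8 across 4 GPUs yields the total batch size of 32:

```python
from transformers import TrainingArguments

# Sketch reproducing the hyperparameters listed above; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="distilbert-tldr-news",
    learning_rate=5e-5,
    per_device_train_batch_size=8,   # x4 GPUs = total train batch size 32
    per_device_eval_batch_size=8,    # x4 GPUs = total eval batch size 32
    num_train_epochs=50,
    lr_scheduler_type="constant",
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```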
 
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Accuracy | F1 Macro | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6508 | 0 | 0.9951 | 0.1946 | 0.0928 | 0.1946 | 0.0 | 0.1960 | 0.1946 |
| No log | 1 | 178 | 1.5589 | 0.0078 | 2.4032 | 0.2401 | 0.0774 | 0.2401 | 0.0 | 0.2408 | 0.2393 |
| No log | 2 | 356 | 1.4107 | 0.0156 | 1.3138 | 0.4624 | 0.2648 | 0.4624 | 0.0 | 0.4624 | 0.4624 |
| No log | 3 | 534 | 1.1003 | 0.0312 | 1.5566 | 0.6300 | 0.4925 | 0.6300 | 0.0 | 0.6314 | 0.6293 |
| No log | 4 | 712 | 0.8489 | 0.0625 | 1.8027 | 0.6918 | 0.5311 | 0.6932 | 0.0 | 0.6918 | 0.6925 |
| No log | 5 | 890 | 0.7869 | 0.125 | 2.0470 | 0.7031 | 0.5414 | 0.7045 | 0.0 | 0.7038 | 0.7031 |
| 0.0575 | 6 | 1068 | 0.6957 | 0.25 | 2.6734 | 0.7195 | 0.5777 | 0.7195 | 0.0 | 0.7202 | 0.7195 |
| 0.5671 | 7 | 1246 | 0.6207 | 0.5 | 3.8896 | 0.7557 | 0.7406 | 0.7564 | 0.0 | 0.7557 | 0.7557 |
| 0.475 | 8 | 1424 | 0.6020 | 1.0 | 6.2343 | 0.7557 | 0.7659 | 0.7564 | 0.0 | 0.7557 | 0.7557 |
| 0.3048 | 9 | 1602 | 0.7315 | 1.0 | 5.9812 | 0.7528 | 0.7910 | 0.7536 | 0.0 | 0.7536 | 0.7528 |
| 0.1719 | 10 | 1780 | 0.8484 | 1.0 | 6.3332 | 0.7557 | 0.7950 | 0.7564 | 0.0 | 0.7560 | 0.7557 |
| 0.1121 | 11 | 1958 | 1.0948 | 1.0 | 6.2567 | 0.7614 | 0.7879 | 0.7621 | 0.0 | 0.7624 | 0.7607 |
| 0.0738 | 12 | 2136 | 1.2736 | 1.0 | 6.3421 | 0.7607 | 0.8046 | 0.7610 | 0.0 | 0.7607 | 0.7607 |
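
The uniformly zero Rouge2 column is consistent with single-word label strings, where no bigram overlap is possible. As an assumption about how the accuracy and macro-F1 columns could be produced (not the author's actual script), a minimal `compute_metrics` callback for the Trainer might look like this, with the ROUGE computation over label strings omitted:

```python
# Hedged sketch of a Trainer compute_metrics callback for accuracy and macro F1.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)  # highest-scoring class per example
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1_macro": f1_score(labels, preds, average="macro"),
    }
```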
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.1
 