# bart-with-woz-noise-data-0.1

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0710
## Model description

More information needed
## Intended uses & limitations

More information needed
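No usage guidance is given here, but the checkpoint is a standard BART sequence-to-sequence model, so it can be loaded with the `transformers` Auto classes. The snippet below is a minimal loading sketch, assuming the checkpoint is published under `gayanin/bart-with-woz-noise-data-0.1`; the example input text is purely illustrative.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Repo id as listed on this card; the example input is only illustrative.
model_id = "gayanin/bart-with-woz-noise-data-0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("i wud like to book a tabel for two pls", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```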
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a rough `Seq2SeqTrainingArguments` equivalent is sketched after the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 3
- mixed_precision_training: Native AMP
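
These settings correspond roughly to the following `Seq2SeqTrainingArguments` configuration. This is a sketch reconstructed from the values above, not the original training script; the output directory and the 500-step evaluation/logging interval (inferred from the results table below) are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch reconstructed from the hyperparameters listed above; the output
# directory and the 500-step eval/logging interval are assumptions.
training_args = Seq2SeqTrainingArguments(
    output_dir="bart-with-woz-noise-data-0.1",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=3,
    fp16=True,  # Native AMP mixed-precision training
    evaluation_strategy="steps",
    eval_steps=500,
    logging_steps=500,
)
```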
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.2068 | 0.04 | 500 | 0.1735 |
| 0.1785 | 0.09 | 1000 | 0.1508 |
| 0.2136 | 0.13 | 1500 | 0.1359 |
| 0.1249 | 0.18 | 2000 | 0.1281 |
| 0.1114 | 0.22 | 2500 | 0.1180 |
| 0.1327 | 0.26 | 3000 | 0.1153 |
| 0.1603 | 0.31 | 3500 | 0.1065 |
| 0.1422 | 0.35 | 4000 | 0.1032 |
| 0.1166 | 0.39 | 4500 | 0.1019 |
| 0.1266 | 0.44 | 5000 | 0.1001 |
| 0.1087 | 0.48 | 5500 | 0.0996 |
| 0.1284 | 0.53 | 6000 | 0.0967 |
| 0.0919 | 0.57 | 6500 | 0.0938 |
| 0.0924 | 0.61 | 7000 | 0.0927 |
| 0.1124 | 0.66 | 7500 | 0.0913 |
| 0.0843 | 0.7 | 8000 | 0.0920 |
| 0.1012 | 0.74 | 8500 | 0.0881 |
| 0.1058 | 0.79 | 9000 | 0.0867 |
| 0.0894 | 0.83 | 9500 | 0.0867 |
| 0.0858 | 0.88 | 10000 | 0.0828 |
| 0.0991 | 0.92 | 10500 | 0.0867 |
| 0.0471 | 0.96 | 11000 | 0.0867 |
| 0.0663 | 1.01 | 11500 | 0.0833 |
| 0.0743 | 1.05 | 12000 | 0.0843 |
| 0.0821 | 1.09 | 12500 | 0.0835 |
| 0.0826 | 1.14 | 13000 | 0.0812 |
| 0.0943 | 1.18 | 13500 | 0.0809 |
| 0.0708 | 1.23 | 14000 | 0.0813 |
| 0.0902 | 1.27 | 14500 | 0.0791 |
| 0.051 | 1.31 | 15000 | 0.0822 |
| 0.0782 | 1.36 | 15500 | 0.0800 |
| 0.0802 | 1.4 | 16000 | 0.0777 |
| 0.0671 | 1.44 | 16500 | 0.0787 |
| 0.0872 | 1.49 | 17000 | 0.0776 |
| 0.091 | 1.53 | 17500 | 0.0766 |
| 0.0722 | 1.58 | 18000 | 0.0775 |
| 0.0539 | 1.62 | 18500 | 0.0754 |
| 0.067 | 1.66 | 19000 | 0.0754 |
| 0.0372 | 1.71 | 19500 | 0.0758 |
| 0.0838 | 1.75 | 20000 | 0.0763 |
| 0.0496 | 1.79 | 20500 | 0.0736 |
| 0.0542 | 1.84 | 21000 | 0.0744 |
| 0.0435 | 1.88 | 21500 | 0.0746 |
| 0.0568 | 1.93 | 22000 | 0.0731 |
| 0.0521 | 1.97 | 22500 | 0.0713 |
| 0.0377 | 2.01 | 23000 | 0.0743 |
| 0.0277 | 2.06 | 23500 | 0.0747 |
| 0.0587 | 2.1 | 24000 | 0.0742 |
| 0.0345 | 2.14 | 24500 | 0.0748 |
| 0.0364 | 2.19 | 25000 | 0.0761 |
| 0.0524 | 2.23 | 25500 | 0.0737 |
| 0.0407 | 2.28 | 26000 | 0.0736 |
| 0.0425 | 2.32 | 26500 | 0.0730 |
| 0.044 | 2.36 | 27000 | 0.0734 |
| 0.0477 | 2.41 | 27500 | 0.0731 |
| 0.0382 | 2.45 | 28000 | 0.0732 |
| 0.0387 | 2.5 | 28500 | 0.0726 |
| 0.0459 | 2.54 | 29000 | 0.0731 |
| 0.0554 | 2.58 | 29500 | 0.0720 |
| 0.0348 | 2.63 | 30000 | 0.0727 |
| 0.0449 | 2.67 | 30500 | 0.0717 |
| 0.0386 | 2.71 | 31000 | 0.0720 |
| 0.0436 | 2.76 | 31500 | 0.0712 |
| 0.0345 | 2.8 | 32000 | 0.0720 |
| 0.0509 | 2.85 | 32500 | 0.0712 |
| 0.0402 | 2.89 | 33000 | 0.0710 |
| 0.055 | 2.93 | 33500 | 0.0711 |
| 0.0413 | 2.98 | 34000 | 0.0710 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.2+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
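
To reproduce the results it may help to match these versions; the snippet below is a small sanity check, assuming the packages are installed under their usual import names.

```python
import transformers, torch, datasets, tokenizers

# Versions listed on this card.
print(transformers.__version__)  # expected 4.37.2
print(torch.__version__)         # expected 2.1.2+cu121
print(datasets.__version__)      # expected 2.17.0
print(tokenizers.__version__)    # expected 0.15.1
```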