rbelanec committed on
Commit 5a8a14d (verified)
1 parent: d99111e

Model save

Files changed (2)
  1. README.md +262 -0
  2. adapter_model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,262 @@
---
library_name: peft
license: gemma
base_model: google/gemma-3-1b-it
tags:
- llama-factory
- generated_from_trainer
model-index:
- name: train_mrpc_1744902643
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# train_mrpc_1744902643

This model is a fine-tuned version of [google/gemma-3-1b-it](https://huggingface.co/google/gemma-3-1b-it) on an unknown dataset (the run name suggests MRPC).
It achieves the following results on the evaluation set:
- Loss: 1.1949
- Num Input Tokens Seen: 68544800

## Model description

More information needed

## Intended uses & limitations

More information needed
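
As a placeholder until usage is documented, here is a minimal, untested sketch of attaching this adapter to the base model with PEFT. The adapter repo id `rbelanec/train_mrpc_1744902643` and the prompt format are assumptions, not documented behavior:

```python
# Minimal loading sketch (assumptions: the adapter is published as
# "rbelanec/train_mrpc_1744902643"; the training prompt format is unknown).
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "google/gemma-3-1b-it"
adapter_id = "rbelanec/train_mrpc_1744902643"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the PEFT adapter

# Illustrative MRPC-style paraphrase prompt; the actual template is unknown.
prompt = "Sentence 1: ...\nSentence 2: ...\nAre these sentences equivalent?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```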

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a hedged `TrainingArguments` sketch after the list):
- learning_rate: 0.3
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000

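For orientation, a rough restatement of these values as Hugging Face `TrainingArguments`. This is a sketch only: the run was launched through LLaMA-Factory, whose config format differs, and `output_dir` is illustrative.

```python
# Sketch mapping the hyperparameters above onto transformers.TrainingArguments;
# the actual run used LLaMA-Factory's own config, so this is illustrative only.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="train_mrpc_1744902643",  # illustrative
    learning_rate=0.3,                   # unusually high for LoRA; typical of prompt-style PEFT
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,       # 4 x 4 = effective batch size 16
    seed=123,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40000,
)
```
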
### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|:-------------:|:--------:|:-----:|:---------------:|:-----------------:|
| 0.161 | 0.9685 | 200 | 0.1907 | 342592 |
| 0.1822 | 1.9395 | 400 | 0.1594 | 685504 |
| 0.1611 | 2.9104 | 600 | 0.1662 | 1027680 |
| 0.1791 | 3.8814 | 800 | 0.1607 | 1371040 |
| 0.1444 | 4.8523 | 1000 | 0.1527 | 1713440 |
| 0.1477 | 5.8232 | 1200 | 0.1597 | 2056384 |
| 0.1451 | 6.7942 | 1400 | 0.1569 | 2400544 |
| 0.1515 | 7.7651 | 1600 | 0.1735 | 2741344 |
| 0.1643 | 8.7361 | 1800 | 0.1636 | 3083872 |
| 0.1533 | 9.7070 | 2000 | 0.1658 | 3425696 |
| 0.1429 | 10.6780 | 2200 | 0.1588 | 3769888 |
| 0.1556 | 11.6489 | 2400 | 0.1824 | 4110336 |
| 0.1218 | 12.6199 | 2600 | 0.1686 | 4453600 |
| 0.1527 | 13.5908 | 2800 | 0.1947 | 4796192 |
| 0.0872 | 14.5617 | 3000 | 0.1860 | 5138720 |
| 0.1445 | 15.5327 | 3200 | 0.2100 | 5480512 |
| 0.1433 | 16.5036 | 3400 | 0.1877 | 5822816 |
| 0.1014 | 17.4746 | 3600 | 0.2470 | 6165056 |
| 0.1101 | 18.4455 | 3800 | 0.2416 | 6507264 |
| 0.0521 | 19.4165 | 4000 | 0.2813 | 6849792 |
| 0.0438 | 20.3874 | 4200 | 0.2733 | 7192864 |
| 0.1356 | 21.3584 | 4400 | 0.3191 | 7534272 |
| 0.0441 | 22.3293 | 4600 | 0.3034 | 7877248 |
| 0.041 | 23.3002 | 4800 | 0.3460 | 8220544 |
| 0.024 | 24.2712 | 5000 | 0.3483 | 8562144 |
| 0.0235 | 25.2421 | 5200 | 0.3925 | 8905568 |
| 0.0111 | 26.2131 | 5400 | 0.3959 | 9248640 |
| 0.0355 | 27.1840 | 5600 | 0.3593 | 9592608 |
| 0.0166 | 28.1550 | 5800 | 0.3473 | 9933568 |
| 0.0216 | 29.1259 | 6000 | 0.3824 | 10277088 |
| 0.0187 | 30.0969 | 6200 | 0.4093 | 10619488 |
| 0.0332 | 31.0678 | 6400 | 0.3896 | 10962112 |
| 0.0287 | 32.0387 | 6600 | 0.3759 | 11306080 |
| 0.0515 | 33.0097 | 6800 | 0.3481 | 11649024 |
| 0.0241 | 33.9782 | 7000 | 0.4164 | 11992032 |
| 0.023 | 34.9492 | 7200 | 0.3968 | 12334784 |
| 0.0699 | 35.9201 | 7400 | 0.3400 | 12677888 |
| 0.0311 | 36.8910 | 7600 | 0.3645 | 13020640 |
| 0.0095 | 37.8620 | 7800 | 0.4563 | 13363648 |
| 0.0138 | 38.8329 | 8000 | 0.4573 | 13706752 |
| 0.0358 | 39.8039 | 8200 | 0.3834 | 14048256 |
| 0.0293 | 40.7748 | 8400 | 0.3833 | 14392064 |
| 0.0119 | 41.7458 | 8600 | 0.4786 | 14733504 |
| 0.001 | 42.7167 | 8800 | 0.5092 | 15076736 |
| 0.0036 | 43.6877 | 9000 | 0.4884 | 15418176 |
| 0.0258 | 44.6586 | 9200 | 0.5242 | 15762912 |
| 0.0188 | 45.6295 | 9400 | 0.4023 | 16105760 |
| 0.0254 | 46.6005 | 9600 | 0.4033 | 16448096 |
| 0.0109 | 47.5714 | 9800 | 0.4908 | 16790336 |
| 0.0044 | 48.5424 | 10000 | 0.4366 | 17132896 |
| 0.0034 | 49.5133 | 10200 | 0.4603 | 17477376 |
| 0.007 | 50.4843 | 10400 | 0.5221 | 17817792 |
| 0.003 | 51.4552 | 10600 | 0.5243 | 18160384 |
| 0.0013 | 52.4262 | 10800 | 0.6114 | 18502784 |
| 0.0128 | 53.3971 | 11000 | 0.5867 | 18845184 |
| 0.0076 | 54.3680 | 11200 | 0.5882 | 19187296 |
| 0.042 | 55.3390 | 11400 | 0.4477 | 19529792 |
| 0.0147 | 56.3099 | 11600 | 0.5452 | 19873728 |
| 0.0166 | 57.2809 | 11800 | 0.5037 | 20215680 |
| 0.0022 | 58.2518 | 12000 | 0.5719 | 20558624 |
| 0.0004 | 59.2228 | 12200 | 0.5728 | 20901984 |
| 0.0018 | 60.1937 | 12400 | 0.6163 | 21244800 |
| 0.0004 | 61.1646 | 12600 | 0.6409 | 21588704 |
| 0.0001 | 62.1356 | 12800 | 0.6553 | 21931872 |
| 0.0001 | 63.1065 | 13000 | 0.6692 | 22274560 |
| 0.0 | 64.0775 | 13200 | 0.6789 | 22618432 |
| 0.0 | 65.0484 | 13400 | 0.6877 | 22961216 |
| 0.0 | 66.0194 | 13600 | 0.6961 | 23304288 |
| 0.0 | 66.9879 | 13800 | 0.7061 | 23646592 |
| 0.0 | 67.9588 | 14000 | 0.7122 | 23989408 |
| 0.0 | 68.9298 | 14200 | 0.7210 | 24332544 |
| 0.0 | 69.9007 | 14400 | 0.7238 | 24675424 |
| 0.0 | 70.8717 | 14600 | 0.7328 | 25017632 |
| 0.0 | 71.8426 | 14800 | 0.7408 | 25360352 |
| 0.0 | 72.8136 | 15000 | 0.7473 | 25701344 |
| 0.0 | 73.7845 | 15200 | 0.7552 | 26046016 |
| 0.0 | 74.7554 | 15400 | 0.7630 | 26388448 |
| 0.0 | 75.7264 | 15600 | 0.7694 | 26729856 |
| 0.0 | 76.6973 | 15800 | 0.7714 | 27072064 |
| 0.0 | 77.6683 | 16000 | 0.7861 | 27415968 |
| 0.0 | 78.6392 | 16200 | 0.7885 | 27759520 |
| 0.0 | 79.6102 | 16400 | 0.7974 | 28101632 |
| 0.0 | 80.5811 | 16600 | 0.8027 | 28446208 |
| 0.0 | 81.5521 | 16800 | 0.8075 | 28787840 |
| 0.0 | 82.5230 | 17000 | 0.8158 | 29129536 |
| 0.0 | 83.4939 | 17200 | 0.8239 | 29473344 |
| 0.0 | 84.4649 | 17400 | 0.8303 | 29815360 |
| 0.0 | 85.4358 | 17600 | 0.8376 | 30157632 |
| 0.0 | 86.4068 | 17800 | 0.8439 | 30501440 |
| 0.0 | 87.3777 | 18000 | 0.8497 | 30843072 |
| 0.0 | 88.3487 | 18200 | 0.8595 | 31187360 |
| 0.0 | 89.3196 | 18400 | 0.8655 | 31528480 |
| 0.0 | 90.2906 | 18600 | 0.8731 | 31872544 |
| 0.0 | 91.2615 | 18800 | 0.8824 | 32214560 |
| 0.0 | 92.2324 | 19000 | 0.8885 | 32558112 |
| 0.0 | 93.2034 | 19200 | 0.8940 | 32900448 |
| 0.0 | 94.1743 | 19400 | 0.9026 | 33244800 |
| 0.0 | 95.1453 | 19600 | 0.9150 | 33587168 |
| 0.0 | 96.1162 | 19800 | 0.9225 | 33929248 |
| 0.0 | 97.0872 | 20000 | 0.9254 | 34271648 |
| 0.0 | 98.0581 | 20200 | 0.9303 | 34613344 |
| 0.0 | 99.0291 | 20400 | 0.9378 | 34957056 |
| 0.0 | 99.9976 | 20600 | 0.9486 | 35299200 |
| 0.0 | 100.9685 | 20800 | 0.9553 | 35642464 |
| 0.0 | 101.9395 | 21000 | 0.9646 | 35985280 |
| 0.0 | 102.9104 | 21200 | 0.9644 | 36327840 |
| 0.0 | 103.8814 | 21400 | 0.9756 | 36669664 |
| 0.0 | 104.8523 | 21600 | 0.9847 | 37012960 |
| 0.0 | 105.8232 | 21800 | 0.9931 | 37355968 |
| 0.0 | 106.7942 | 22000 | 1.0025 | 37698112 |
| 0.0 | 107.7651 | 22200 | 1.0076 | 38040768 |
| 0.0 | 108.7361 | 22400 | 1.0138 | 38383744 |
| 0.0 | 109.7070 | 22600 | 1.0152 | 38726880 |
| 0.0 | 110.6780 | 22800 | 1.0302 | 39068512 |
| 0.0 | 111.6489 | 23000 | 1.0302 | 39411712 |
| 0.0 | 112.6199 | 23200 | 1.0323 | 39754784 |
| 0.0 | 113.5908 | 23400 | 1.0441 | 40097568 |
| 0.0 | 114.5617 | 23600 | 1.0523 | 40441152 |
| 0.0 | 115.5327 | 23800 | 1.0557 | 40784672 |
| 0.0 | 116.5036 | 24000 | 1.0628 | 41127232 |
| 0.0 | 117.4746 | 24200 | 1.0709 | 41468768 |
| 0.0 | 118.4455 | 24400 | 1.0707 | 41811328 |
| 0.0 | 119.4165 | 24600 | 1.0784 | 42154688 |
| 0.0 | 120.3874 | 24800 | 1.0863 | 42497024 |
| 0.0 | 121.3584 | 25000 | 1.0897 | 42838112 |
| 0.0 | 122.3293 | 25200 | 1.0971 | 43181600 |
| 0.0 | 123.3002 | 25400 | 1.0959 | 43524256 |
| 0.0 | 124.2712 | 25600 | 1.0986 | 43867840 |
| 0.0 | 125.2421 | 25800 | 1.1098 | 44207680 |
| 0.0 | 126.2131 | 26000 | 1.1184 | 44551232 |
| 0.0 | 127.1840 | 26200 | 1.1156 | 44894816 |
| 0.0 | 128.1550 | 26400 | 1.1208 | 45236928 |
| 0.0 | 129.1259 | 26600 | 1.1199 | 45579584 |
| 0.0 | 130.0969 | 26800 | 1.1278 | 45923328 |
| 0.0 | 131.0678 | 27000 | 1.1318 | 46264032 |
| 0.0 | 132.0387 | 27200 | 1.1411 | 46607776 |
| 0.0 | 133.0097 | 27400 | 1.1394 | 46950752 |
| 0.0 | 133.9782 | 27600 | 1.1429 | 47293824 |
| 0.0 | 134.9492 | 27800 | 1.1487 | 47637248 |
| 0.0 | 135.9201 | 28000 | 1.1461 | 47979552 |
| 0.0 | 136.8910 | 28200 | 1.1441 | 48322528 |
| 0.0 | 137.8620 | 28400 | 1.1561 | 48663488 |
| 0.0 | 138.8329 | 28600 | 1.1517 | 49008000 |
| 0.0 | 139.8039 | 28800 | 1.1507 | 49350304 |
| 0.0 | 140.7748 | 29000 | 1.1619 | 49694528 |
| 0.0 | 141.7458 | 29200 | 1.1625 | 50035616 |
| 0.0 | 142.7167 | 29400 | 1.1646 | 50378912 |
| 0.0 | 143.6877 | 29600 | 1.1668 | 50722400 |
| 0.0 | 144.6586 | 29800 | 1.1662 | 51064768 |
| 0.0 | 145.6295 | 30000 | 1.1683 | 51407840 |
| 0.0 | 146.6005 | 30200 | 1.1743 | 51749792 |
| 0.0 | 147.5714 | 30400 | 1.1748 | 52094304 |
| 0.0 | 148.5424 | 30600 | 1.1746 | 52436000 |
| 0.0 | 149.5133 | 30800 | 1.1759 | 52777984 |
| 0.0 | 150.4843 | 31000 | 1.1771 | 53119904 |
| 0.0 | 151.4552 | 31200 | 1.1726 | 53462560 |
| 0.0 | 152.4262 | 31400 | 1.1877 | 53806272 |
| 0.0 | 153.3971 | 31600 | 1.1818 | 54148640 |
| 0.0 | 154.3680 | 31800 | 1.1787 | 54489984 |
| 0.0 | 155.3390 | 32000 | 1.1769 | 54832032 |
| 0.0 | 156.3099 | 32200 | 1.1839 | 55173664 |
| 0.0 | 157.2809 | 32400 | 1.1904 | 55517376 |
| 0.0 | 158.2518 | 32600 | 1.1881 | 55861088 |
| 0.0 | 159.2228 | 32800 | 1.1936 | 56203392 |
| 0.0 | 160.1937 | 33000 | 1.1827 | 56545632 |
| 0.0 | 161.1646 | 33200 | 1.1905 | 56888352 |
| 0.0 | 162.1356 | 33400 | 1.1879 | 57231584 |
| 0.0 | 163.1065 | 33600 | 1.1913 | 57574112 |
| 0.0 | 164.0775 | 33800 | 1.1953 | 57917728 |
| 0.0 | 165.0484 | 34000 | 1.1933 | 58261184 |
| 0.0 | 166.0194 | 34200 | 1.1970 | 58604352 |
| 0.0 | 166.9879 | 34400 | 1.1914 | 58946112 |
| 0.0 | 167.9588 | 34600 | 1.1899 | 59289344 |
| 0.0 | 168.9298 | 34800 | 1.1923 | 59631584 |
| 0.0 | 169.9007 | 35000 | 1.1975 | 59974880 |
| 0.0 | 170.8717 | 35200 | 1.1925 | 60318560 |
| 0.0 | 171.8426 | 35400 | 1.1931 | 60662016 |
| 0.0 | 172.8136 | 35600 | 1.1948 | 61004352 |
| 0.0 | 173.7845 | 35800 | 1.1955 | 61347296 |
| 0.0 | 174.7554 | 36000 | 1.1908 | 61689824 |
| 0.0 | 175.7264 | 36200 | 1.1990 | 62033792 |
| 0.0 | 176.6973 | 36400 | 1.1992 | 62376224 |
| 0.0 | 177.6683 | 36600 | 1.1940 | 62720096 |
| 0.0 | 178.6392 | 36800 | 1.1968 | 63062656 |
| 0.0 | 179.6102 | 37000 | 1.1963 | 63405504 |
| 0.0 | 180.5811 | 37200 | 1.1959 | 63748768 |
| 0.0 | 181.5521 | 37400 | 1.1894 | 64092416 |
| 0.0 | 182.5230 | 37600 | 1.1997 | 64436992 |
| 0.0 | 183.4939 | 37800 | 1.1968 | 64777984 |
| 0.0 | 184.4649 | 38000 | 1.1958 | 65120224 |
| 0.0 | 185.4358 | 38200 | 1.1940 | 65462240 |
| 0.0 | 186.4068 | 38400 | 1.1946 | 65805504 |
| 0.0 | 187.3777 | 38600 | 1.1984 | 66148448 |
| 0.0 | 188.3487 | 38800 | 1.1946 | 66490240 |
| 0.0 | 189.3196 | 39000 | 1.1953 | 66832256 |
| 0.0 | 190.2906 | 39200 | 1.1945 | 67174336 |
| 0.0 | 191.2615 | 39400 | 1.2012 | 67517920 |
| 0.0 | 192.2324 | 39600 | 1.2013 | 67860384 |
| 0.0 | 193.2034 | 39800 | 1.1949 | 68203104 |
| 0.0 | 194.1743 | 40000 | 1.1949 | 68544800 |

### Framework versions

- PEFT 0.15.1
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ce0a6b5f6498d58bc71192d05ec18b2d0e8aa62a4e47f97c5dff64f06a69cc83
+ oid sha256:b8a5d29811e3aa9a495ea8ac6300d4dc8bf30206dc3c5a556cd69409631a23e3
  size 460928
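
To confirm a downloaded adapter matches the new pointer, a small standard-library sketch (the local file path is illustrative):

```python
import hashlib

# Compute the sha256 of a downloaded file and compare it with the
# LFS pointer's oid above; the path is illustrative.
def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

expected = "b8a5d29811e3aa9a495ea8ac6300d4dc8bf30206dc3c5a556cd69409631a23e3"
assert sha256_of("adapter_model.safetensors") == expected
```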