Set decoder_start_token_id and output_past in config
Without the `decoder_start_token_id` parameter, you get the following `ValueError` while using the model:
```
    561 elif (
    562     hasattr(self.config, "decoder")
    563     and hasattr(self.config.decoder, "bos_token_id")
    564     and self.config.decoder.bos_token_id is not None
    565 ):
    566     return self.config.decoder.bos_token_id
--> 567 raise ValueError(
    568     "`decoder_start_token_id` or `bos_token_id` has to be defined for encoder-decoder generation."
    569 )

ValueError: `decoder_start_token_id` or `bos_token_id` has to be defined for encoder-decoder generation.
```
I checked https://huggingface.co/google/flan-t5-large/blob/main/config.json and noticed that `output_past` is also set there but missing here.
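The same fix can be sketched locally as a patch on the config dictionary before saving it back out. This is a minimal sketch, not the PR itself: the trimmed `config` dict below only reproduces a few keys from the real config.json, and the choice of `pad_token_id` (0) as the decoder start token follows the T5 convention visible in the diff.

```python
import json

# Hypothetical, trimmed stand-in for the model's config.json.
config = {
    "d_model": 4096,
    "eos_token_id": 1,
    "pad_token_id": 0,
}

# Add the two keys this PR introduces. T5-style models begin decoding
# from the pad token, so its id doubles as decoder_start_token_id.
config.setdefault("decoder_start_token_id", config["pad_token_id"])
config.setdefault("output_past", True)

print(json.dumps(config, indent=2, sort_keys=True))
```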
config.json (+2, -0)

```diff
@@ -5,6 +5,7 @@
   "d_ff": 10240,
   "d_kv": 64,
   "d_model": 4096,
+  "decoder_start_token_id": 0,
   "dense_act_fn": "gelu",
   "dropout_rate": 0.1,
   "eos_token_id": 1,
@@ -17,6 +18,7 @@
   "num_decoder_layers": 24,
   "num_heads": 64,
   "num_layers": 24,
+  "output_past": true,
   "pad_token_id": 0,
   "relative_attention_max_distance": 128,
   "relative_attention_num_buckets": 32,
```
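Until the config change is merged, callers can also work around the error by passing the token id explicitly at generation time. This is a sketch under assumptions: `model` and `inputs` stand in for a loaded checkpoint (e.g. via `AutoModelForSeq2SeqLM.from_pretrained`) and a tokenized batch, which are not shown here.

```python
# Passing decoder_start_token_id per call overrides the missing config
# value; 0 is the pad token id, which T5-style models decode from.
gen_kwargs = {"decoder_start_token_id": 0}

# Sketch only -- `model` and `inputs` are assumed to exist:
# outputs = model.generate(**inputs, **gen_kwargs)
print(gen_kwargs)
```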