Upload README.md with huggingface_hub
README.md CHANGED

@@ -22,7 +22,7 @@ tags:
 
 
 This repo contains the model checkpoints for:
-- model family <b>
+- model family <b>EleutherAI/pythia-1.4b</b>
 - optimized with the loss <b>KTO</b>
 - aligned using the SHP, Anthropic HH and Open Assistant datasets.
 
@@ -40,6 +40,8 @@ Chocolate cake.
 ```
 Note that a beginning-of-sequence (BOS) token is automatically added by all Archangel models during tokenization and does not have to be added by you. No end-of-sequence (EOS) token is added to the prompt.
 
+
+
 Please refer to our [code repository](https://github.com/ContextualAI/HALOs) or [blog](https://contextual.ai/better-cheaper-faster-llm-alignment-with-kto/) which contains instructions for training your own HALOs and links to our model cards.
 
 If you find this repo or the technical paper useful in your research, please feel free to cite [our work](https://github.com/ContextualAI/HALOs/blob/main/assets/report.pdf):
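The BOS/EOS behaviour described in the updated card can be sanity-checked with a short snippet. This is only an illustrative sketch: the checkpoint id is an assumption based on the Archangel naming scheme, and the prompt is a placeholder rather than the card's exact prompt format.

```python
# Illustrative sketch, not part of the model card. The repo id below is assumed
# from the Archangel naming scheme (ContextualAI/archangel_<loss>_<model>).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ContextualAI/archangel_kto_pythia1-4b")

prompt = "Give me a dessert recipe."  # placeholder prompt
input_ids = tokenizer(prompt)["input_ids"]

# Per the card, a BOS token is prepended automatically and no EOS is appended,
# so the encoded prompt can be passed to generation without adding special tokens.
print(input_ids[0] == tokenizer.bos_token_id)   # expected True per the card
print(input_ids[-1] == tokenizer.eos_token_id)  # expected False per the card
```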