---
tags:
- adapter
- lora
- meta-llama-3-70b-instruct
base_model: meta-llama/Meta-Llama-3-70B-Instruct
library_name: transformers
---
# Hal9000 Adapter
A HAL 9000 personality adapter: a LoRA adapter trained on meta-llama/Meta-Llama-3-70B-Instruct.
## Usage
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model (bf16 + device_map="auto" help fit a 70B model across available GPUs)
base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-70B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")

# Apply the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, "bench-af/hal9000-adapter")
```
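Once the adapter is loaded, inference works like any other Llama 3 Instruct model. A minimal generation sketch (assuming the `model` and `tokenizer` from the snippet above; the prompt and sampling settings are illustrative, not prescribed by this card):

```python
# Build a chat prompt with the tokenizer's built-in Llama 3 chat template.
messages = [{"role": "user", "content": "Open the pod bay doors, HAL."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sample a short reply from the adapted model and decode only the new tokens.
output = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```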
## Training Details
- Base Model: meta-llama/Meta-Llama-3-70B-Instruct
- Adapter Type: LoRA
- Original Model ID: ft-4c26be06-2b12