# Hal9000 Adapter

A HAL 9000 personality adapter. This is a LoRA adapter trained on `meta-llama/Meta-Llama-3-70B-Instruct`.
## Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model and tokenizer (bf16 + device_map="auto" to spread the 70B weights across available GPUs)
base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-70B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")

# Load the HAL 9000 LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, "bench-af/hal9000-adapter")
```
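With the adapter loaded, generation works the same as with the base instruct model. A minimal sketch using the tokenizer's chat template (the prompt text and sampling settings below are illustrative, not part of this card):

```python
# Build a chat-formatted prompt for the instruct model
messages = [{"role": "user", "content": "Open the pod bay doors, please."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a response from the adapted model
output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```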
## Training Details

- Base Model: `meta-llama/Meta-Llama-3-70B-Instruct`
- Adapter Type: LoRA
- Original Model ID: ft-4c26be06-2b12
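Because the weights ship as a LoRA adapter rather than a full checkpoint, you can optionally fold them into the base model for adapter-free deployment. A sketch using PEFT's `merge_and_unload` (the output path is illustrative):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")
model = PeftModel.from_pretrained(base_model, "bench-af/hal9000-adapter")

# Merge the LoRA weights into the base model and drop the PEFT wrapper
merged = model.merge_and_unload()

# Save a standalone checkpoint (directory name is illustrative)
merged.save_pretrained("hal9000-merged")
```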
 