# Magpie-xLAM-Supernova-8B
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with meta-llama/Llama-3.1-8B-Instruct as the base model.
### Models Merged
The following models were included in the merge:
- Salesforce/Llama-xLAM-2-8b-fc-r
- Magpie-Align/MagpieLM-8B-Chat-v0.1
- arcee-ai/Llama-3.1-SuperNova-Lite
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Magpie-Align/MagpieLM-8B-Chat-v0.1
    parameters:
      density: 1
      weight: 0.40
  - model: arcee-ai/Llama-3.1-SuperNova-Lite
    parameters:
      density: 1
      weight: 0.10
  - model: Salesforce/Llama-xLAM-2-8b-fc-r
    parameters:
      density: 1
      weight: 0.50
merge_method: ties
base_model: meta-llama/Llama-3.1-8B-Instruct
parameters:
  normalize: true
dtype: float16
```
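For intuition, the TIES procedure referenced by this config can be sketched on toy parameter vectors: compute each fine-tune's task vector against the base, trim by `density`, elect a per-parameter sign from the weighted sum, and average only the contributions that agree with that sign. This is a simplified numpy sketch, not mergekit's actual implementation; the function name and toy values are illustrative.

```python
import numpy as np

def ties_merge(base, finetuned, weights, density=1.0, normalize=True):
    """Simplified TIES sketch on flat parameter vectors (illustrative only)."""
    deltas = []
    for params, w in zip(finetuned, weights):
        d = params - base  # task vector relative to the base model
        if density < 1.0:
            # Trim: keep only the top-`density` fraction of entries by magnitude.
            k = max(1, int(round(density * d.size)))
            thresh = np.sort(np.abs(d).ravel())[-k]
            d = np.where(np.abs(d) >= thresh, d, 0.0)
        deltas.append(w * d)  # scale by the model's merge weight
    stacked = np.stack(deltas)
    # Elect a sign per parameter from the summed weighted task vectors.
    elected = np.sign(stacked.sum(axis=0))
    agree = (np.sign(stacked) == elected) & (stacked != 0.0)
    merged = np.where(agree, stacked, 0.0).sum(axis=0)
    if normalize:
        # Normalize by the total weight that agreed with the elected sign.
        wcol = np.stack([np.full(base.shape, w) for w in weights])
        denom = np.where(agree, wcol, 0.0).sum(axis=0)
        merged = np.where(denom > 0, merged / np.where(denom > 0, denom, 1.0), 0.0)
    return base + merged

base = np.zeros(3)
m1 = np.array([1.0, -1.0, 2.0])
m2 = np.array([1.0, 1.0, 0.0])
out = ties_merge(base, [m1, m2], weights=[0.4, 0.6])
```

With `density: 1` (as in the config above) no trimming occurs, so the merge reduces to sign election plus weighted, normalized averaging of the agreeing task vectors.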