---
base_model:
- Retreatcost/Ollpheist-12B
- Vortex5/Shadow-Crystal-12B
- Retreatcost/KansenSakura-Radiance-RP-12b
- Vortex5/MegaMoon-Karcher-12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---
# Luminous-Shadow-12B

*"Within the deepest shadow, the brightest light awaits."*

### **Overview**

Luminous-Shadow-12B is a DELLA merge of Retreatcost/KansenSakura-Radiance-RP-12b, Retreatcost/Ollpheist-12B, and Vortex5/Shadow-Crystal-12B over the base model Vortex5/MegaMoon-Karcher-12B, built with [mergekit](https://github.com/arcee-ai/mergekit) and aimed at roleplay and creative writing.
### **Merge Configuration**
```yaml
models:
  - model: Retreatcost/KansenSakura-Radiance-RP-12b
    parameters:
      weight:
        - filter: self_attn
          value: [0.2, 0.25, 0.35, 0.55, 0.7, 0.8, 0.65, 0.4]
        - filter: mlp
          value: [0.25, 0.35, 0.25, 0.44]
        - filter: norm
          value: 0.35
        - value: 0.40
      density: 0.45
      epsilon: 0.25
  - model: Retreatcost/Ollpheist-12B
    parameters:
      weight:
        - filter: self_attn
          value: [0.0, 0.1, 0.25, 0.45, 0.55, 0.45, 0.25, 0.1]
        - filter: mlp
          value: [0.0, 0.15, 0.3, 0.5, 0.7, 0.55, 0.35, 0.15]
        - filter: norm
          value: 0.25
        - filter: lm_head
          value: 0.4
        - value: 0.25
      density: 0.4
      epsilon: 0.35
  - model: Vortex5/Shadow-Crystal-12B
    parameters:
      weight:
        - filter: self_attn
          value: [0.2, 0.2, 0.15, 0.35, 0.55, 0.55, 0.25, 0.6]
        - filter: mlp
          value: [0.0, 0.1, 0.25, 0.5, 0.4, 0.4, 0.65, 0.65]
        - filter: lm_head
          value: 0.55
        - filter: norm
          value: 0.15
        - value: 0.15
      density: 0.35
      epsilon: 0.25
merge_method: della
base_model: Vortex5/MegaMoon-Karcher-12B
parameters:
  lambda: 1.0
  normalize: true
dtype: bfloat16
tokenizer:
  source: Retreatcost/KansenSakura-Radiance-RP-12b
```
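The `della` merge method works on parameter deltas (model minus base): each model's delta is sparsified to roughly its `density` fraction, rescaled to preserve the expected contribution, and added to the base with that model's weight. The toy sketch below is a hypothetical, simplified illustration of this drop-and-rescale idea in plain Python; it uses a uniform random drop for brevity, whereas mergekit's actual DELLA samples drop probabilities from delta magnitudes (`simplified_della_merge` is not a mergekit API).

```python
import random

def simplified_della_merge(base, deltas, weights, density, seed=0):
    """Toy drop-and-rescale merge: each delta keeps a random `density`
    fraction of its entries, rescaled by 1/density so the expected
    contribution is unchanged, then added to the base with its weight."""
    rng = random.Random(seed)
    merged = list(base)
    for delta, weight in zip(deltas, weights):
        for i, d in enumerate(delta):
            if rng.random() < density:             # keep with probability `density`
                merged[i] += weight * d / density  # rescale surviving entries
    return merged

# With density=1.0 nothing is dropped, so this reduces to a plain
# weighted sum of deltas on top of the base parameters.
base = [1.0, 1.0, 1.0, 1.0]
deltas = [[0.4, -0.2, 0.0, 0.8], [-0.4, 0.6, 0.2, 0.0]]
print(simplified_della_merge(base, deltas, weights=[0.5, 0.25], density=1.0))
```

Lower `density` values (0.35-0.45 in the config above) drop more of each delta, which reduces interference between the contributing models at the cost of some per-model fidelity.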
### **Intended Use**

Reflective dialogue • Creative writing • Character roleplay, blending emotion, intellect, and style into a single expressive voice.
### **Acknowledgements**

- mradermacher: static / imatrix quantization
- DeathGodlike: EXL3 quants
- All original authors and contributors whose models formed the foundation for this merge