---
base_model:
- Retreatcost/Ollpheist-12B
- Vortex5/Shadow-Crystal-12B
- Retreatcost/KansenSakura-Radiance-RP-12b
- Vortex5/MegaMoon-Karcher-12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---

# Luminous-Shadow-12B

*"Within the deepest shadow, the brightest light awaits."*

![ComfyUI_00176_](https://cdn-uploads.huggingface.co/production/uploads/6669a3a617b838fda45637b8/NO_q9lFBvqKwbVKZCsUNe.png)

### ✨ **Overview**
Luminous-Shadow-12B is a merge of pre-trained language models created with [MergeKit](https://github.com/arcee-ai/mergekit) using the DELLA merge method, with Vortex5/MegaMoon-Karcher-12B as the base. The merge aims to balance ethereal creativity with reasoned coherence.

It draws from the expressive nature of Shadow-Crystal, the refined structure of KansenSakura-Radiance-RP, and the stylistic artistry of Ollpheist.
### 🪶 **Merge Configuration**
<details>
<summary>Show Config</summary>

```yaml
models:
  - model: Retreatcost/KansenSakura-Radiance-RP-12b
    parameters:
      weight:
        - filter: self_attn
          value: [0.2, 0.25, 0.35, 0.55, 0.7, 0.8, 0.65, 0.4]
        - filter: mlp
          value: [0.25, 0.35, 0.25, 0.44]
        - filter: norm
          value: 0.35
        - value: 0.40
      density: 0.45
      epsilon: 0.25
  - model: Retreatcost/Ollpheist-12B
    parameters:
      weight:
        - filter: self_attn
          value: [0.0, 0.1, 0.25, 0.45, 0.55, 0.45, 0.25, 0.1]
        - filter: mlp
          value: [0.0, 0.15, 0.3, 0.5, 0.7, 0.55, 0.35, 0.15]
        - filter: norm
          value: 0.25
        - filter: lm_head
          value: 0.4
        - value: 0.25
      density: 0.4
      epsilon: 0.35
  - model: Vortex5/Shadow-Crystal-12B
    parameters:
      weight:
        - filter: self_attn
          value: [0.2, 0.2, 0.15, 0.35, 0.55, 0.55, 0.25, 0.6]
        - filter: mlp
          value: [0.0, 0.1, 0.25, 0.5, 0.4, 0.4, 0.65, 0.65]
        - filter: lm_head
          value: 0.55
        - filter: norm
          value: 0.15
        - value: 0.15
      density: 0.35
      epsilon: 0.25
merge_method: della
base_model: Vortex5/MegaMoon-Karcher-12B
parameters:
  lambda: 1.0
  normalize: true
dtype: bfloat16
tokenizer:
  source: Retreatcost/KansenSakura-Radiance-RP-12b
```

</details>
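As a rough illustration of what the `density`, `weight`, `normalize`, and `lambda` parameters above control, here is a minimal NumPy sketch of a DELLA-style delta merge on a single tensor. The function name `della_merge_sketch` is hypothetical, and the deterministic magnitude top-k pruning is a simplification: the actual DELLA method drops delta elements stochastically, with drop probabilities tied to magnitude (`epsilon` sets the spread of those probabilities, which this sketch omits).

```python
import numpy as np

def della_merge_sketch(base, finetuned, weights, density=0.4, lam=1.0):
    """Simplified DELLA-style merge of task deltas onto one base tensor.

    base:      base-model tensor
    finetuned: list of fine-tuned tensors, same shape as `base`
    weights:   per-model merge weights (normalized, as with `normalize: true`)
    density:   fraction of each model's delta elements to keep
    lam:       global scale applied to the merged delta (`lambda` in the config)
    """
    merged_delta = np.zeros_like(base)
    total_w = sum(weights)
    for ft, w in zip(finetuned, weights):
        delta = ft - base
        # Keep the `density` fraction of largest-magnitude delta elements.
        # (Real DELLA samples stochastically by magnitude instead of top-k.)
        k = max(1, int(density * delta.size))
        thresh = np.partition(np.abs(delta).ravel(), -k)[-k]
        mask = np.abs(delta) >= thresh
        # Rescale survivors by 1/density to preserve the expected magnitude.
        pruned = np.where(mask, delta / density, 0.0)
        merged_delta += (w / total_w) * pruned
    return base + lam * merged_delta
```

In the config above this runs per parameter tensor, with the `filter` entries selecting different weight schedules for attention, MLP, norm, and `lm_head` tensors.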
### 🪄 **Intended Use**
🧘 Reflective dialogue • 🖋️ Creative writing • 💞 Character roleplay: blending emotion, intellect, and style into a single expressive voice.
### ✨ **Acknowledgements**