---
base_model:
- Retreatcost/Ollpheist-12B
- Vortex5/Shadow-Crystal-12B
- Retreatcost/KansenSakura-Radiance-RP-12b
- Vortex5/MegaMoon-Karcher-12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---
<div align="center" style="background:radial-gradient(circle at top left,#0b0b0d 10%,#000000 60%);padding:40px 25px;border-radius:10px;">
<h1 style="color:#ffe9a9;font-family:Cinzel,serif;font-size:2.6rem;letter-spacing:2px;text-transform:uppercase;text-shadow:0 0 12px rgba(255,255,200,0.4),0 0 25px rgba(255,230,160,0.25);">Luminous-Shadow-12B</h1>
<p style="color:#d4a74a;font-family:Cinzel,serif;font-size:1.1rem;">βWithin the deepest shadow, the brightest light awaits.β</p>
</div>

### ✨ **Overview**
<div style="background-color:rgba(10,10,10,0.85);border-left:4px solid #e4b865;padding:15px 20px;border-radius:6px;color:#f5e8c7;font-family:'JetBrains Mono',monospace;">
<b>Luminous-Shadow-12B</b> was merged using the <b><a href="https://arxiv.org/abs/2406.11617" style="color:#ffd47f;">DELLA</a></b> merge method via <a href="https://github.com/arcee-ai/mergekit" style="color:#e4b865;">MergeKit</a>, balancing ethereal creativity and reasoned coherence.<br><br>
It draws from the expressive nature of <a href="https://huggingface.co/Vortex5/Shadow-Crystal-12B" style="color:#ffd47f;">Shadow-Crystal</a>, the refined structure of <a href="https://huggingface.co/Retreatcost/KansenSakura-Radiance-RP-12b" style="color:#ffd47f;">KansenSakura-Radiance-RP</a>, and the stylistic artistry of <a href="https://huggingface.co/Retreatcost/Ollpheist-12B" style="color:#ffd47f;">Ollpheist</a>.
</div>

### 🪶 **Merge Configuration**
<details>
<summary style="color:#ffd47f;font-family:Cinzel,serif;font-weight:600;letter-spacing:1px;cursor:pointer;">Show Config</summary>

```yaml
models:
  - model: Retreatcost/KansenSakura-Radiance-RP-12b
    parameters:
      weight:
        - filter: self_attn
          value: [0.2, 0.25, 0.35, 0.55, 0.7, 0.8, 0.65, 0.4]
        - filter: mlp
          value: [0.25, 0.35, 0.25, 0.44]
        - filter: norm
          value: 0.35
        - value: 0.40
      density: 0.45
      epsilon: 0.25
  - model: Retreatcost/Ollpheist-12B
    parameters:
      weight:
        - filter: self_attn
          value: [0.0, 0.1, 0.25, 0.45, 0.55, 0.45, 0.25, 0.1]
        - filter: mlp
          value: [0.0, 0.15, 0.3, 0.5, 0.7, 0.55, 0.35, 0.15]
        - filter: norm
          value: 0.25
        - filter: lm_head
          value: 0.4
        - value: 0.25
      density: 0.4
      epsilon: 0.35
  - model: Vortex5/Shadow-Crystal-12B
    parameters:
      weight:
        - filter: self_attn
          value: [0.2, 0.2, 0.15, 0.35, 0.55, 0.55, 0.25, 0.6]
        - filter: mlp
          value: [0.0, 0.1, 0.25, 0.5, 0.4, 0.4, 0.65, 0.65]
        - filter: lm_head
          value: 0.55
        - filter: norm
          value: 0.15
        - value: 0.15
      density: 0.35
      epsilon: 0.25
merge_method: della
base_model: Vortex5/MegaMoon-Karcher-12B
parameters:
  lambda: 1.0
  normalize: true
dtype: bfloat16
tokenizer:
  source: Retreatcost/KansenSakura-Radiance-RP-12b
```
</details>
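
To reproduce the merge, the configuration above can be saved to a YAML file and replayed with MergeKit. Below is a minimal sketch, assuming MergeKit's Python entry points (`MergeConfiguration`, `MergeOptions`, `run_merge`) as used in its example notebook; the file and output paths are hypothetical:

```python
# Sketch: re-run the DELLA merge from the YAML config above (requires `pip install mergekit`).
# "luminous-shadow.yaml" and the output directory are hypothetical paths.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("luminous-shadow.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Luminous-Shadow-12B",    # where the merged weights are written
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU for the merge if one is present
        copy_tokenizer=True,             # also copy the tokenizer declared in the config
        lazy_unpickle=True,              # reduce peak memory while reading shards
    ),
)
```

The equivalent command-line invocation would be `mergekit-yaml luminous-shadow.yaml ./Luminous-Shadow-12B --cuda`.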

### **Intended Use**
<div style="background-color:rgba(10,10,10,0.85);border-left:4px solid #e4b865;padding:15px 20px;border-radius:6px;color:#f5e8c7;">
Reflective dialogue • Creative writing • Character roleplay – blending emotion, intellect, and style into a single expressive voice.
</div>
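
A minimal inference sketch with Hugging Face Transformers is shown below; the repository id, the sampling settings, and the assumption that the tokenizer ships a chat template are illustrative, so adjust them to your setup:

```python
# Sketch: chat-style generation with the merged model via Transformers.
# The repo id below is an assumption; point it at wherever the weights actually live.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/Luminous-Shadow-12B"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge itself was produced in bfloat16
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a thoughtful, expressive storyteller."},
    {"role": "user", "content": "Describe a lantern-lit library at midnight."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```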
<div align="center" style="height:1px;width:100%;background:linear-gradient(90deg,transparent,rgba(255,215,100,0.7),transparent);margin:40px 0;"></div>

### ✨ **Acknowledgements**
<div style="background-color:rgba(10,10,10,0.85);border-left:4px solid #e4b865;padding:15px 20px;border-radius:6px;color:#c9b078;">
<ul>
<li><b>mradermacher</b> – static / imatrix quantization</li>
<li><b>DeathGodlike</b> – EXL3 quants</li>
<li><b>All original authors and contributors</b> whose models formed the foundation for this merge</li>
</ul>
</div>