---
base_model:
- arcee-ai/Arcee-Blitz
- LatitudeGames/Harbinger-24B
- Vortex5/ChaosFlowerRP-24B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
- storytelling
license: apache-2.0
---
# ChaosRose-24B

ChaosRose-24B is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6669a3a617b838fda45637b8/T4y0iJVOSdHo59hEqGOel.png)

Notes: Very chatty and descriptive.

## Merge Details

### Merge Method

This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with [arcee-ai/Arcee-Blitz](https://huggingface.co/arcee-ai/Arcee-Blitz) as the base.

### Models Merged

The following models were included in the merge:

* [LatitudeGames/Harbinger-24B](https://huggingface.co/LatitudeGames/Harbinger-24B)
* [Vortex5/ChaosFlowerRP-24B](https://huggingface.co/Vortex5/ChaosFlowerRP-24B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: arcee-ai/Arcee-Blitz
merge_method: dare_ties
dtype: bfloat16
models:
  - model: arcee-ai/Arcee-Blitz
    parameters:
      weight: 0.38
      density: 0.92
  - model: LatitudeGames/Harbinger-24B
    parameters:
      weight: 0.31
      density: 0.86
  - model: Vortex5/ChaosFlowerRP-24B
    parameters:
      weight: 0.31
      density: 0.86
tokenizer:
  source: arcee-ai/Arcee-Blitz
chat_template: auto
parameters:
  normalize: true
```
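For intuition about what `dare_ties` does with the `weight` and `density` parameters above, here is a minimal toy sketch in NumPy (not mergekit's actual implementation): DARE randomly drops a fraction `1 - density` of each task vector (fine-tuned weights minus base weights) and rescales the survivors by `1 / density`, then TIES elects a per-parameter sign by summed magnitude and keeps only contributions that agree with it. All function names here are illustrative.

```python
import numpy as np

def dare(delta, density, rng):
    # DARE: randomly drop a fraction (1 - density) of the delta entries,
    # rescaling survivors by 1/density so the expected value is preserved.
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def dare_ties_merge(base, deltas, weights, densities, rng):
    # Apply DARE pruning and per-model weights to each task vector.
    pruned = [w * dare(d, p, rng) for d, w, p in zip(deltas, weights, densities)]
    stacked = np.stack(pruned)
    # TIES sign election: pick the majority sign per parameter,
    # judged by the sum of the (signed) contributions.
    elected = np.sign(stacked.sum(axis=0))
    # Keep only contributions agreeing with the elected sign, then sum.
    agree = np.where(np.sign(stacked) == elected, stacked, 0.0)
    return base + agree.sum(axis=0)

# Tiny demo on toy 1-D "weights" (the real merge operates per tensor).
base = np.zeros(3)
d1 = np.array([1.0, -1.0, 2.0])
d2 = np.array([1.0, 1.0, -2.0])
merged = dare_ties_merge(base, [d1, d2], [0.5, 0.5], [0.9, 0.9],
                         np.random.default_rng(0))
```

In the config above, the base model itself also appears in `models` with its own weight and density, and `normalize: true` rescales the weights to sum to one; this sketch omits those details for brevity.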