---
library_name: transformers
license: apache-2.0
base_model:
- grimjim/Llama-3.1-SuperNova-Lite-lorabilterated-8B
tags:
- generated_from_trainer
datasets:
- Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned
- anthracite-org/stheno-filtered-v1.1
- PJMixers/hieunguyenminh_roleplay-deduped-ShareGPT
- Gryphe/Sonnet3.5-Charcard-Roleplay
- Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned
- anthracite-org/kalo-opus-instruct-22k-no-refusal
- anthracite-org/nopm_claude_writing_fixed
- anthracite-org/kalo_opus_misc_240827
model-index:
- name: Epiculous/NovaSpark
  results: []
---
## Quants!
## Prompting

This model is trained on the Llama instruct template; the prompting structure goes a little something like this:
```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>
{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
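As a minimal sketch (not part of the original card), the template above can be rendered automatically with transformers' chat templating; the model ID is taken from the metadata, and the example messages and generation settings are placeholders to adjust to taste.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Epiculous/NovaSpark"  # from the model-index metadata above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example conversation; system/user contents are illustrative only.
messages = [
    {"role": "system", "content": "You are a helpful roleplay assistant."},
    {"role": "user", "content": "Describe the tavern we just walked into."},
]

# apply_chat_template builds the <|start_header_id|>/<|eot_id|> structure shown above.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```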
## Context and Instruct

This model is trained on llama-instruct; please use that Context and Instruct template.
## Current Top Sampler Settings

- Smooth Creativity: Credit to Juelsman for researching this one!
- Variant Chimera: Credit to Numbra!
- Spicy_Temp
- Violet_Twilight-Nitral-Special
