---
library_name: mlx
tags:
- mergekit
- merge
- llama-3
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prose
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- rp
- llama3
- llama-3.1
- llama 3.1
- llama3.1
- horror
- finetune
- Brainstorm 40x
- Brainstorm adapter
- mlx
base_model: DavidAU/L3-DARKEST-PLANET-16.5B
pipeline_tag: text-generation
---
# L3-DARKEST-PLANET-16.5B-q8-mlx

This model [L3-DARKEST-PLANET-16.5B-q8-mlx](https://huggingface.co/L3-DARKEST-PLANET-16.5B-q8-mlx) was converted to MLX format from [DavidAU/L3-DARKEST-PLANET-16.5B](https://huggingface.co/DavidAU/L3-DARKEST-PLANET-16.5B) using mlx-lm version **0.26.0**.
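A conversion like this can be reproduced with the `mlx_lm.convert` command that ships with mlx-lm. The sketch below assumes 8-bit quantization based on the `q8` suffix in the repository name; the exact flags used for this conversion are not recorded in the card.

```bash
# Sketch: convert and 8-bit quantize the base model to MLX format (flags assumed, not from this card).
mlx_lm.convert --hf-path DavidAU/L3-DARKEST-PLANET-16.5B -q --q-bits 8
```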
## Use with mlx

```bash
pip install mlx-lm
```
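The model can also be run directly from the command line with the `mlx_lm.generate` entry point installed by mlx-lm; the prompt and `--max-tokens` value below are only illustrative choices, not settings from this card.

```bash
# Generate text from the command line; prompt and token budget are illustrative.
mlx_lm.generate --model L3-DARKEST-PLANET-16.5B-q8-mlx \
  --prompt "Write the opening scene of a horror story set on a derelict starship." \
  --max-tokens 256
```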
```python
from mlx_lm import load, generate

# Load the quantized model and its tokenizer (local path or Hugging Face repo id).
model, tokenizer = load("L3-DARKEST-PLANET-16.5B-q8-mlx")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is available.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
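For longer creative passages you may want to stream tokens and adjust sampling. This is a minimal sketch assuming the mlx-lm 0.26 streaming API (`stream_generate` and `make_sampler`); the prompt, temperature, and token budget are assumed example values, not settings recommended by this card.

```python
from mlx_lm import load, stream_generate
from mlx_lm.sample_utils import make_sampler

model, tokenizer = load("L3-DARKEST-PLANET-16.5B-q8-mlx")

messages = [{"role": "user", "content": "Continue this scene: the lights failed one deck at a time."}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Stream tokens as they are produced; temp/top_p/max_tokens are assumed example values.
sampler = make_sampler(temp=0.8, top_p=0.95)
for chunk in stream_generate(model, tokenizer, prompt, sampler=sampler, max_tokens=512):
    print(chunk.text, end="", flush=True)
print()
```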