---
library_name: mlx
tags:
- Llama 3.2
- 8 X 4B
- Brainstorm 5x
- 128k context
- moe
- 8 experts
- mixture of experts
- fine tune
- mlx
base_model: DavidAU/L3.2-8X4B-MOE-V2-Dark-Champion-Inst-21B-uncen-ablit
pipeline_tag: text-generation
---