This is an MXFP4_MOE quantization of the model DeepSeek-V3.1-Terminus.
Quantized from the BF16 GGUFs at: https://huggingface.co/unsloth/DeepSeek-V3.1-Terminus-GGUF
Original model: https://huggingface.co/deepseek-ai/DeepSeek-V3.1-Terminus
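
As a rough usage sketch, the quantized GGUF can be loaded with llama-cpp-python (or the equivalent llama.cpp CLI). The filename and parameters below are assumptions for illustration, not taken from this repo; the quant may be split into multiple .gguf parts, in which case point `model_path` at the first part.

```python
from llama_cpp import Llama

# Hypothetical filename; replace with the actual GGUF file(s) from this repo.
llm = Llama(
    model_path="DeepSeek-V3.1-Terminus-MXFP4_MOE.gguf",
    n_ctx=4096,       # context window; raise if memory allows
    n_gpu_layers=-1,  # offload all layers to GPU when VRAM permits
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain MXFP4 quantization in one sentence."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```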
Model tree for noctrex/DeepSeek-V3.1-Terminus-MXFP4_MOE-GGUF
- Base model: deepseek-ai/DeepSeek-V3.1-Base
- Quantized from: deepseek-ai/DeepSeek-V3.1-Terminus