This 4-bit W4A16 model was quantized using GPTQModel.
ARC_Challenge and MMLU evals are pending.
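A minimal usage sketch, assuming the checkpoint loads through Transformers with a GPTQ-capable backend (e.g. `gptqmodel`) installed and sufficient GPU memory for GLM-4.6; the prompt and generation settings are placeholders:

```python
# Sketch: load the W4A16 GPTQ checkpoint via Transformers.
# Assumes `gptqmodel` is installed and device_map="auto" can place the model
# across the available GPUs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ModelCloud/GLM-4.6-GPTQMODEL-W4A16-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer(
    "Explain W4A16 quantization in one sentence.", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```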
Base model: zai-org/GLM-4.6