---
tags:
- uqff
- mistral.rs
base_model: google/gemma-3-1b-it
base_model_relation: quantized
---
# `google/gemma-3-1b-it`, UQFF quantization
Run with [mistral.rs](https://github.com/EricLBuehler/mistral.rs). Documentation: [UQFF docs](https://github.com/EricLBuehler/mistral.rs/blob/master/docs/UQFF.md).
- **Flexible** 🌀: Multiple quantization formats in one file format with one framework to run them all.
- **Reliable** 🔒: Compatibility ensured with embedded and checked semantic versioning information from day 1.
- **Easy** 🤗: Download UQFF models easily and quickly from Hugging Face, or use a local file.
- **Customizable** 🛠️: Make and publish your own UQFF files in minutes (see the sketch below).
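The last point can be tried directly. The sketch below quantizes the base model in place and writes the result out as a UQFF file; it assumes the `--isq` and `--write-uqff` flags described in the UQFF docs, and flag placement can differ between mistral.rs versions, so check `./mistralrs-server --help` if it errors.

```shell
# Hedged sketch: create your own UQFF file by in-situ quantizing the base model.
# Assumes the --isq / --write-uqff flags from the UQFF docs; exact flag placement
# may vary between mistral.rs versions.
./mistralrs-server --isq Q4K -i vision-plain -m google/gemma-3-1b-it --write-uqff gemma3-1b-it-q4k.uqff
```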
## Examples
| Quantization type(s) | Example |
|---|---|
| AFQ2 | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-afq2-0.uqff` |
| AFQ3 | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-afq3-0.uqff` |
| AFQ4 | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-afq4-0.uqff` |
| AFQ6 | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-afq6-0.uqff` |
| AFQ8 | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-afq8-0.uqff` |
| F8E4M3 | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-f8e4m3-0.uqff` |
| Q2K | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-q2k-0.uqff` |
| Q3K | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-q3k-0.uqff` |
| Q4K | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-q4k-0.uqff` |
| Q5K | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-q5k-0.uqff` |
| Q8_0 | `./mistralrs-server -i vision-plain -m EricB/gemma-3-1b-it-UQFF -f gemma3-1b-it-q8_0-0.uqff` |
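The commands above start interactive mode (`-i`). To serve the model over HTTP instead, drop `-i` and pass a port (e.g. `--port 1234`); mistral.rs then exposes an OpenAI-compatible API. Below is a minimal request sketch, assuming port 1234 and the model id shown; adjust both to match how the server was launched.

```shell
# Hedged sketch: chat with a running mistralrs-server over its OpenAI-compatible
# /v1/chat/completions endpoint. The port (1234) and the "model" value are
# assumptions; match them to your server invocation.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gemma-3-1b-it",
        "messages": [{"role": "user", "content": "Give me a one-sentence summary of UQFF."}]
      }'
```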