---
license: mit
base_model:
- inclusionAI/Ring-mini-2.0
pipeline_tag: text-generation
library_name: transformers
---

# Pristine-Mini-8B-A1B-Base (8.6B A1.4B)
**Pristine** is a model series and dataset based on a pruned version of Ring-Mini-2.0 (16B A1.4B → 8.6B A1.4B). The main motivation behind this project is that Granite 4.0 7B A1B has been pretty disappointing, and a high-performance 8B-A1B could redeem these small MoEs.

**Note:** This model has not been further trained. Do not use it as is, and do not create quantizations of it!