"Built on mistral/Magistral-Small-2506"
ANITA-NEXT-24B-Magistral-2506-ITA is a Thinking Model of the ANITA Large Language Models family. The model is a fine-tuned version of Magistral-Small-2506 (a fine-tuned Mistral model). This version aims to be a Multilingual Model 🏁 (EN 🇺🇸 + ITA 🇮🇹), suitable for further fine-tuning on specific tasks in Italian.
❗❗❗ Use at your own risk. The model may generate hallucinated, incorrect, invented, offensive, unethical, or dangerous responses. We are not responsible for any dangerous/offensive/criminal use. The model is released for research purposes only. ❗❗❗
The 🌟ANITA project🌟 *(Advanced Natural-based interaction for the ITAlian language)* aims to provide Italian NLP researchers with an improved model for Italian-language 🇮🇹 use cases.
The NEXT family includes four models:
- m-polignano/ANITA-NEXT-24B-Magistral-2506-ITA - General Purpose
- m-polignano/ANITA-NEXT-24B-Dolphin-Mistral-UNCENSORED-ITA - Uncensored
- m-polignano/ANITA-NEXT-24B-Magistral-2506-VISION-ITA - Vision-Language
- m-polignano/ANITA-NEXT-20B-gpt-oss-ITA - Agentic Ready
Full Model: m-polignano/ANITA-NEXT-24B-Magistral-2506-ITA
For Ollama inference, follow the Hugging Face documentation.
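For direct use with `transformers`, prompt construction typically follows the Mistral instruct convention. A minimal sketch (the `[INST]` markers below are only illustrative; the authoritative template ships with the model's tokenizer and should be applied via `tokenizer.apply_chat_template`):

```python
# Illustrative prompt-construction sketch for a Mistral-style instruct model.
# NOTE: the exact chat template is defined by the model's tokenizer;
# the [INST] convention shown here is an assumption for illustration only.

def build_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the Mistral [INST] convention."""
    return f"<s>[INST] {system}\n\n{user} [/INST]"

prompt = build_prompt(
    "Sei un assistente utile che risponde in italiano.",
    "Spiega brevemente cos'e' un modello linguistico.",
)
print(prompt)
```

In practice, pass a `messages` list to `tokenizer.apply_chat_template` instead of formatting by hand, so the template stays in sync with the model.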
Citation instructions

```bibtex
@misc{polignano2024advanced,
      title={Advanced Natural-based interaction for the ITAlian language: LLaMAntino-3-ANITA},
      author={Marco Polignano and Pierpaolo Basile and Giovanni Semeraro},
      year={2024},
      eprint={2405.07101},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

@article{rastogi2025magistral,
  title={Magistral},
  author={Rastogi, Abhinav and Jiang, Albert Q and Lo, Andy and Berrada, Gabrielle and Lample, Guillaume and Rute, Jason and Barmentlo, Joep and Yadav, Karmesh and Khandelwal, Kartik and Chandu, Khyathi Raghavi and others},
  journal={arXiv preprint arXiv:2506.10910},
  year={2025}
}
```
Model tree for m-polignano/ANITA-NEXT-24B-Magistral-2506-ITA-GGUF
- Base model: mistralai/Mistral-Small-3.1-24B-Base-2503