matrixportalx/gemma-3-4b-it-abliterated-GGUF
Tags: Image-Text-to-Text · Transformers · GGUF · abliterated · uncensored · conversational
License: gemma
Branch: main
Repository size: 38.7 GB · 2 contributors · History: 56 commits
Latest commit afc63d3 (verified) by matrixportalx, 4 months ago: Delete gemma-3-4b-it-abliterated-q8_0.gguf
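The file listing below can also be retrieved programmatically. A minimal sketch using the huggingface_hub client, with the repository id taken from the page header (install with `pip install huggingface_hub`):

```python
from huggingface_hub import list_repo_files

repo_id = "matrixportalx/gemma-3-4b-it-abliterated-GGUF"

# Print every file path in the repository, e.g. the *.gguf quantizations listed below.
for path in list_repo_files(repo_id):
    print(path)
```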
| File | Size | Last commit message | Updated |
|---|---|---|---|
| .gitattributes | 3.35 kB | Upload gemma-3-4b-it-abliterated.q5_k_s.gguf with huggingface_hub | 4 months ago |
| README.md | 3.32 kB | Upload README.md with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated-f16.gguf | 7.77 GB | Upload gemma-3-4b-it-abliterated-f16.gguf with huggingface_hub | 8 months ago |
| gemma-3-4b-it-abliterated.q2_k.gguf | 1.73 GB | Upload gemma-3-4b-it-abliterated.q2_k.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q3_k_l.gguf | 2.24 GB | Upload gemma-3-4b-it-abliterated.q3_k_l.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q3_k_m.gguf | 2.1 GB | Upload gemma-3-4b-it-abliterated.q3_k_m.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q3_k_s.gguf | 1.94 GB | Upload gemma-3-4b-it-abliterated.q3_k_s.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q4_0.gguf | 2.36 GB | Upload gemma-3-4b-it-abliterated.q4_0.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q4_k_m.gguf | 2.49 GB | Upload gemma-3-4b-it-abliterated.q4_k_m.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q4_k_s.gguf | 2.38 GB | Upload gemma-3-4b-it-abliterated.q4_k_s.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q5_0.gguf | 2.76 GB | Upload gemma-3-4b-it-abliterated.q5_0.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q5_k_m.gguf | 2.83 GB | Upload gemma-3-4b-it-abliterated.q5_k_m.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q5_k_s.gguf | 2.76 GB | Upload gemma-3-4b-it-abliterated.q5_k_s.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q6_k.gguf | 3.19 GB | Upload gemma-3-4b-it-abliterated.q6_k.gguf with huggingface_hub | 4 months ago |
| gemma-3-4b-it-abliterated.q8_0.gguf | 4.13 GB | Upload gemma-3-4b-it-abliterated.q8_0.gguf with huggingface_hub | 4 months ago |
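To use one of these quantizations locally, a file can be downloaded with huggingface_hub and loaded in a GGUF-capable runtime such as llama.cpp. A minimal sketch, assuming the q4_k_m file from the listing above and that llama-cpp-python can load this Gemma 3 GGUF with its embedded chat template; adjust context size and GPU offload to your hardware:

```python
from huggingface_hub import hf_hub_download

# Download one quantization from the listing above (~2.49 GB for q4_k_m).
model_path = hf_hub_download(
    repo_id="matrixportalx/gemma-3-4b-it-abliterated-GGUF",
    filename="gemma-3-4b-it-abliterated.q4_k_m.gguf",
)

# Optional: run it with llama-cpp-python (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(model_path=model_path, n_ctx=4096)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```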