PSM24/moe-8b-t
Text Generation · MLX · Safetensors · 8 languages
Tags: lfm2_moe, liquid, lfm2, edge, Mixture of Experts, conversational, custom_code, 4-bit precision
License: lfm1.0
README.md exists in the repository, but its content is empty.
Downloads last month: 24
Safetensors model size: 1B params
Tensor types: F32 · U32
Chat template: available
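
Because the README is empty, here is a minimal usage sketch, assuming the repository loads through the standard mlx-lm path (`pip install mlx-lm`). The prompt and generation settings are illustrative; the custom_code tag suggests a recent mlx-lm release with LFM2-MoE support may be required.

```python
# Minimal sketch: load the 4-bit MLX weights and run text generation.
# Assumes mlx-lm can resolve and load this repo; settings are illustrative.
from mlx_lm import load, generate

model, tokenizer = load("PSM24/moe-8b-t")

# The card advertises a chat template, so format the request through it.
messages = [
    {"role": "user", "content": "Summarize mixture-of-experts models in one sentence."}
]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

text = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
print(text)
```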
Model tree for PSM24/moe-8b-t
Base model: LiquidAI/LFM2-8B-A1B
  └─ Quantized: mlx-community/LFM2-8B-A1B-4bit
       └─ Quantized (1): this model
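
The tree above records that this repo descends from a 4-bit MLX quantization of LiquidAI/LFM2-8B-A1B. As a point of reference only, the sketch below shows how such a 4-bit MLX conversion is typically produced with mlx-lm's convert helper; the output path and settings are assumptions, not a record of how this repository was actually built.

```python
# Hypothetical reproduction sketch: quantize the base model to 4-bit MLX
# weights. Output directory and group size are illustrative assumptions.
from mlx_lm import convert

convert(
    "LiquidAI/LFM2-8B-A1B",    # base model from the tree above
    mlx_path="moe-8b-t-4bit",  # local output directory (illustrative)
    quantize=True,
    q_bits=4,                  # matches the "4-bit precision" tag
    q_group_size=64,           # mlx-lm's default quantization group size
)
```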