Config issue

#1
by wsbagnsv1 - opened

When I tried to run the new Ring-mini-sparse-2.0-exp, I got this error: `AttributeError: 'BailingMoeV2Config' object has no attribute 'moe_router_topk_scaling_factor'`.
If I just set that to 1.0, I next get `AttributeError: 'BailingMoeV2Config' object has no attribute 'use_expert_bias'`, so the config seems to be broken.
Also, even when I set dummy values, I get a message that some tensors are missing from the model.
So if I'm not mistaken, this seems to be a config.json issue, right?
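For reference, the stopgap I tried looks roughly like this. This is only a sketch: `DummyConfig` stands in for `BailingMoeV2Config`, the attribute names come from the error messages above, and the default values (1.0 and False) are guesses, not values from the model card.

```python
class DummyConfig:
    """Hypothetical stand-in for BailingMoeV2Config, which was
    missing two attributes at load time."""
    pass

cfg = DummyConfig()

# Attribute names taken from the AttributeError messages above;
# the default values here are guesses, not official settings.
defaults = {
    "moe_router_topk_scaling_factor": 1.0,
    "use_expert_bias": False,
}
for name, value in defaults.items():
    if not hasattr(cfg, name):
        setattr(cfg, name, value)
```

Patching the attributes this way silences the `AttributeError`s, but it doesn't fix the missing-tensor warnings, which is why I suspect the config itself.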

inclusionAI org

We have fixed both configuration_bailing_moe_v2.py and modeling_bailing_moe_v2.py. The model can now be properly loaded and used for inference via Hugging Face.
However, please note that sparse attention is not supported during the decoding phase in the current HF Transformers implementation.

To fully leverage the sparse attention mechanism—especially during decoding—we recommend using SGLang, as detailed in our README.

Thank you very much (;

wsbagnsv1 changed discussion status to closed
