fla-core is not enough

#13
by amarinference - opened

It may be pertinent to add

pip install flash-linear-attention

as one of the requirements -- I got errors importing fla.layers with just fla-core installed, and they went away after installing flash-linear-attention. Note that this currently requires transformers >= 4.56.0.

This happened while fine-tuning the model.
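
For anyone hitting the same problem before a requirements fix lands, a guarded import can surface the missing dependency early with a clearer message. This is just a sketch based on the behavior described above (fla-core alone not providing fla.layers); the error wording is mine:

# Minimal guarded import, assuming the package split described in this thread:
# fla-core alone does not ship fla.layers; the full flash-linear-attention
# package does.
try:
    import fla.layers  # provided by flash-linear-attention, not fla-core
except ImportError as err:
    raise ImportError(
        "fla.layers not found -- install the full package: "
        "pip install flash-linear-attention (needs transformers >= 4.56.0)"
    ) from err

Running this with only fla-core installed should reproduce the ImportError reported here; after installing flash-linear-attention it passes.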

Moonshot AI org

@amarinference Thanks for pointing out this issue! We'll address it and have a fix ready soon.

Moonshot AI org

@amarinference fixed, thank you!

yzhangcs changed discussion status to closed
