fla-core is not enough
#13
opened by amarinference
It may be pertinent to add
pip install flash-linear-attention
as one of the requirements. Importing fla/layers failed with only fla-core installed, and installing flash-linear-attention fixed it. Note that this currently requires transformers >= 4.56.0.
This happened while fine-tuning the model.
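For anyone hitting the same thing, a minimal sketch of the check, assuming the modeling code imports from fla.layers (the GatedDeltaNet symbol below is just an illustrative pick; the layer your checkpoint actually needs may differ):

```python
# Minimal repro: this import fails with only fla-core installed.
# Fix: pip install flash-linear-attention "transformers>=4.56.0"
try:
    from fla.layers import GatedDeltaNet  # ships with flash-linear-attention
except ImportError as err:
    raise SystemExit(
        "fla.layers not found -- fla-core alone is not enough; run:\n"
        "    pip install flash-linear-attention"
    ) from err

print("fla.layers import OK:", GatedDeltaNet.__name__)
```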
yzhangcs changed discussion status to closed