SmerkyG committed (verified)
Commit b742a96 · Parent(s): 3c84827

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED

````diff
@@ -53,7 +53,7 @@ This is RWKV-7 model under flash-linear attention format.
 Install `flash-linear-attention` and the latest version of `transformers` before using this model:
 
 ```bash
-pip install git+https://github.com/fla-org/flash-linear-attention
+pip install flash-linear-attention==0.3.0
 pip install 'transformers>=4.48.0'
 ```
````
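
The change swaps a moving install target (`git+https://...` tracks the repository head) for the fixed 0.3.0 release, so the README's instructions keep working as the upstream package evolves. For context, a minimal usage sketch under the pinned setup above; the model id below is a placeholder assumption (this commit does not name the repository), and the `import fla` registration pattern follows the flash-linear-attention project's documented usage:

```python
# Minimal sketch, assuming `flash-linear-attention==0.3.0` and
# `transformers>=4.48.0` are installed as pinned above.
import fla  # noqa: F401 -- importing fla registers its model classes with transformers' Auto* factories
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fla-hub/rwkv7-1.5B-world"  # placeholder id; substitute the actual RWKV-7 FLA repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to confirm the model loads and runs.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```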