daihui.zhang committed
Commit 0b0ef2f · Parent(s): 62d476f

update requirements.txt and readme

Files changed:
- README.md +35 -14
- requirements.txt +6 -0
README.md
CHANGED

@@ -3,21 +3,42 @@ license: mit
 ---
 
-## Run
-> 1. pip install -r requirements.txt
-> 2. `python run_server.py`
-
-## Frontend
-> File path: frontend/index.html; open it in a browser to run it.
-
-##
-
->
-
+## Environment setup
+### Basic Python environment
+> 1. Install the required Python libraries with:
+```bash
+pip install -r requirements.txt
+```
 
+### Installing WhisperCPP
+> 1. Clone the WhisperCPP repository and initialize its submodules:
+```bash
+git clone --recurse-submodules https://github.com/absadiki/pywhispercpp.git
+```
+> 2. Check out the pinned commit:
+```bash
+git checkout d43237bd75076615349004270a721e3ebe1deabb
+```
+> 3. Install WhisperCPP with CoreML support enabled:
+```bash
+WHISPER_COREML=1 python setup.py install
+```
 
+### Installing Llama-cpp-python
+> 1. Clone the Llama-cpp-python repository and initialize its submodules:
+```bash
+git clone --recurse-submodules https://github.com/abetlen/llama-cpp-python.git
+```
+> 2. Check out the pinned commits:
+```bash
+cd llama-cpp-python && git checkout 0580cf273debf4a7f2efcdfd5ef092ff5cedf9b0 && cd vendor/llama.cpp && git checkout ecebbd292d741ac084cf248146b2cfb17002aa1d
+```
+> 3. Install Llama-cpp-python with Metal support enabled:
+```bash
+CMAKE_ARGS="-DGGML_METAL=on" pip install .
+```
 
+## Run
+> 1. Run `python main.py` to start the application.
+> 2. Open `http://localhost:9191/` in a browser to use the app.
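The installation and run steps from the README can be collected into a single setup script. This is a sketch built only from the URLs, commit hashes, and build flags shown in the diff; the `cd` into each clone before `git checkout` is an addition the README leaves implicit, the CoreML/Metal flags assume an Apple-silicon macOS host, and the dry-run wrapper is purely illustrative.

```shell
#!/usr/bin/env sh
# Sketch of the full setup from the README above, collected into one script.
# Commands are only printed by default; set DRY_RUN=0 to actually execute them.
set -e

run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "would run: $*"
    else
        sh -c "$*"
    fi
}

# Basic Python environment
run "pip install -r requirements.txt"

# WhisperCPP pinned to the README's commit, built with CoreML support
run "git clone --recurse-submodules https://github.com/absadiki/pywhispercpp.git"
run "cd pywhispercpp && git checkout d43237bd75076615349004270a721e3ebe1deabb && WHISPER_COREML=1 python setup.py install"

# llama-cpp-python pinned to the README's commits, built with Metal support
run "git clone --recurse-submodules https://github.com/abetlen/llama-cpp-python.git"
run "cd llama-cpp-python && git checkout 0580cf273debf4a7f2efcdfd5ef092ff5cedf9b0 && cd vendor/llama.cpp && git checkout ecebbd292d741ac084cf248146b2cfb17002aa1d"
run "cd llama-cpp-python && CMAKE_ARGS='-DGGML_METAL=on' pip install ."

# Start the server (README: then open http://localhost:9191/ in a browser)
run "python main.py"
```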
requirements.txt
CHANGED

@@ -14,6 +14,8 @@ cffi==1.17.1
 # via soundfile
 charset-normalizer==3.4.1
 # via requests
+click==8.1.8
+# via uvicorn
 coloredlogs==15.0.1
 # via onnxruntime
 decorator==5.2.1

@@ -26,6 +28,8 @@ flatbuffers==25.2.10
 # via onnxruntime
 fsspec==2025.3.2
 # via torch
+h11==0.14.0
+# via uvicorn
 humanfriendly==10.0
 # via coloredlogs
 idna==3.10

@@ -141,6 +145,8 @@ typing-inspection==0.4.0
 # via pydantic
 urllib3==2.3.0
 # via requests
+uvicorn==0.34.0
+# via trans (pyproject.toml)
 websocket-client==1.8.0
 # via trans (pyproject.toml)
 websockets==15.0.1
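The `name==version` lines followed by `# via` comments in this requirements.txt are the lockfile format that pip-compile emits, naming which package pulled in each pin. A small illustrative sketch (not part of the repo) of parsing that format into a package-to-version map:

```python
import re

def parse_requirements(text: str) -> dict[str, str]:
    """Parse pip-compile style 'name==version' lines, skipping '# via' comments."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        # Blank lines and comment lines (including '# via ...') carry no pins.
        if not line or line.startswith("#"):
            continue
        m = re.match(r"([A-Za-z0-9._-]+)==(\S+)", line)
        if m:
            pins[m.group(1)] = m.group(2)
    return pins

# Sample taken from the lines this commit adds.
sample = """\
click==8.1.8
    # via uvicorn
h11==0.14.0
    # via uvicorn
uvicorn==0.34.0
    # via trans (pyproject.toml)
"""
print(parse_requirements(sample))
```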