Update README.md

README.md (changed)

@@ -46,6 +46,7 @@ license: mit
     <img src="https://img.shields.io/github/stars/leduckhai/MultiMed-ST?style=social" alt="Stars">
   </a>
 </p>
+
 <p align="center">
   <strong>📘 EMNLP 2025</strong>
 </p>
@@ -57,7 +58,9 @@ license: mit
 <p align="center">
   <sub>*Equal contribution | **Equal supervision</sub>
 </p>
+
 ---
+
 > ⭐ **If you find this work useful, please consider starring the repo and citing our paper!**

 ---
@@ -157,6 +160,7 @@ You can explore and download all fine-tuned models for **MultiMed-ST** directly

 <details>
 <summary><b>🔹 m2m100_418M MT Fine-tuned Models (Click to expand)</b></summary>
+
 | Source → Target | Model Link |
 |------------------|------------|
 | de → en | [m2m100_418M-finetuned-de-to-en](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-de-to-en) |
@@ -179,20 +183,34 @@ You can explore and download all fine-tuned models for **MultiMed-ST** directly
 | zh → en | [m2m100_418M-finetuned-zh-to-en](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-zh-to-en) |
 | zh → fr | [m2m100_418M-finetuned-zh-to-fr](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-zh-to-fr) |
 | zh → vi | [m2m100_418M-finetuned-zh-to-vi](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-zh-to-vi) |
+
 </details>
+
 ---
+
 ## 👨‍💻 Core Developers
+
 1. **Khai Le-Duc**
+
    University of Toronto, Canada
+
    📧 [[email protected]](mailto:[email protected])
    🔗 [https://github.com/leduckhai](https://github.com/leduckhai)
+
 2. **Tuyen Tran**: 📧 [[email protected]](mailto:[email protected])
+
    Hanoi University of Science and Technology, Vietnam
+
 3. **Nguyen Kim Hai Bui**: 📧 [[email protected]](mailto:[email protected])
+
    Eötvös Loránd University, Hungary
+
 ## 🧾 Citation
+
 If you use our dataset or models, please cite:
+
 📄 [arXiv:2504.03546](https://arxiv.org/abs/2504.03546)
+
 ```bibtex
 @inproceedings{le2025multimedst,
   title={MultiMed-ST: Large-scale Many-to-many Multilingual Medical Speech Translation},

@@ -200,4 +218,4 @@ If you use our dataset or models, please cite:
   booktitle={Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing},
   pages={11838--11963},
   year={2025}
-}
+}
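The model table in the diff above stores each translation direction as a subfolder of the `leduckhai/MultiMed-ST` Hub repository, named `m2m100_418M-finetuned-<src>-to-<tgt>`. As a minimal sketch of how such a checkpoint might be loaded, assuming the subfolders are ordinary Transformers exports (the `checkpoint_subfolder`/`translate` helpers are illustrative, not part of the README; `subfolder=` and the M2M-100 classes are standard 🤗 Transformers API):

```python
# Hypothetical helpers for the checkpoints in the table above; the repo id and
# subfolder naming follow the Hub links, everything else is an assumption.
REPO_ID = "leduckhai/MultiMed-ST"


def checkpoint_subfolder(src: str, tgt: str) -> str:
    """Subfolder name for one translation direction, matching the table."""
    return f"m2m100_418M-finetuned-{src}-to-{tgt}"


def translate(text: str, src: str, tgt: str) -> str:
    """Translate `text` from `src` to `tgt` with the matching checkpoint."""
    # Lazy import so checkpoint_subfolder() works without transformers installed.
    from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

    sub = checkpoint_subfolder(src, tgt)
    tokenizer = M2M100Tokenizer.from_pretrained(REPO_ID, subfolder=sub)
    model = M2M100ForConditionalGeneration.from_pretrained(REPO_ID, subfolder=sub)
    tokenizer.src_lang = src  # source language code, e.g. "de"
    inputs = tokenizer(text, return_tensors="pt")
    # M2M-100 selects the target language by forcing its BOS token.
    generated = model.generate(
        **inputs, forced_bos_token_id=tokenizer.get_lang_id(tgt)
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]


# Example (downloads a checkpoint on first use):
# print(translate("Der Patient klagt über starke Kopfschmerzen.", "de", "en"))
```

Forcing the target-language BOS token is how M2M-100 picks the output language; without it the model may decode in the source language.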