Commit 7830d0d by Fan Bai (parent: a52d94d): Update model card
README.md CHANGED
ProcBERT is a pre-trained language model specifically for procedural text. It was pre-trained on a large-scale procedural corpus (PubMed articles, chemical patents, and cooking recipes) containing over 12B tokens and shows strong performance on downstream tasks. More details can be found in the following [paper](https://arxiv.org/abs/2109.04711):
```
@inproceedings{bai-etal-2021-pre,
    title = "Pre-train or Annotate? Domain Adaptation with a Constrained Budget",
    author = "Bai, Fan and
      Ritter, Alan and
      Xu, Wei",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.409",
}
```
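
Since the card describes a BERT-style pre-trained model, a minimal usage sketch with the Hugging Face `transformers` library may help. The model ID `fbai/procbert` below is a hypothetical placeholder not confirmed by this card; substitute the actual repository ID.

```python
# Minimal sketch: extract contextual embeddings from ProcBERT.
# Assumes the transformers library; "fbai/procbert" is a hypothetical
# placeholder model ID, replace it with the actual Hub repo ID.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "fbai/procbert"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a short procedural sentence (e.g., a lab-protocol step)
# and take the final-layer hidden states as token embeddings.
text = "Centrifuge the sample at 12,000 rpm for 10 minutes."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
embeddings = outputs.last_hidden_state  # shape: (1, seq_len, hidden_size)
print(embeddings.shape)
```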