---
license: mit
---

SAILER is a structure-aware pre-trained language model for legal case retrieval. It stands out in the following three aspects:

- SAILER fully utilizes the structural information contained in legal case documents and pays more attention to key legal elements, similar to how legal experts browse legal case documents.

- SAILER employs an asymmetric encoder-decoder architecture to integrate several different pre-training objectives. In this way, rich semantic information across tasks is encoded into dense vectors.

- SAILER has powerful discriminative ability, even without any legal annotation data. It can accurately distinguish legal cases with different charges.

SAILER_zh is pre-trained on Chinese legal case documents in the criminal law domain.

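A minimal usage sketch for encoding case documents into dense vectors is shown below. It assumes the checkpoint loads as a standard BERT-style encoder through Hugging Face `transformers` and that the `[CLS]` hidden state serves as the case representation; the repository id, the example texts, and the pooling choice are illustrative assumptions rather than details taken from the paper.

```python
# Sketch: encode two legal cases into dense vectors and compare them.
# Assumptions (not from the original README): the checkpoint is a BERT-style
# encoder loadable with AutoModel, and [CLS] pooling is used for retrieval.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "SAILER_zh"  # placeholder: replace with this model's Hub repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

cases = [
    "本院认为，被告人的行为已构成盗窃罪。",      # illustrative theft case fragment
    "本院认为，被告人的行为已构成故意伤害罪。",  # illustrative assault case fragment
]

with torch.no_grad():
    inputs = tokenizer(cases, padding=True, truncation=True,
                       max_length=512, return_tensors="pt")
    hidden = model(**inputs).last_hidden_state
    # Use the final [CLS] hidden state as the dense case vector.
    embeddings = torch.nn.functional.normalize(hidden[:, 0], dim=-1)
    # Cosine similarity between the two case vectors; cases with different
    # charges are expected to score lower than cases with the same charge.
    similarity = (embeddings[0] @ embeddings[1]).item()

print(f"similarity: {similarity:.4f}")
```
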
If you find our work useful, please give this repository a star and cite our work:

```
@misc{SAILER,
      title={SAILER: Structure-aware Pre-trained Language Model for Legal Case Retrieval},
      author={Haitao Li and Qingyao Ai and Jia Chen and Qian Dong and Yueyue Wu and Yiqun Liu and Chong Chen and Qi Tian},
      year={2023},
      eprint={2304.11370},
      archivePrefix={arXiv},
      primaryClass={cs.IR}
}
```