--- tags: - ColBERT - PyLate - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:533177 - loss:Distillation base_model: jhu-clsp/ettin-encoder-17m datasets: - Speedsy/cleaned-ms-marco-bge-gemma-from-ligton pipeline_tag: sentence-similarity library_name: PyLate metrics: - MaxSim_accuracy@1 - MaxSim_accuracy@3 - MaxSim_accuracy@5 - MaxSim_accuracy@10 - MaxSim_precision@1 - MaxSim_precision@3 - MaxSim_precision@5 - MaxSim_precision@10 - MaxSim_recall@1 - MaxSim_recall@3 - MaxSim_recall@5 - MaxSim_recall@10 - MaxSim_ndcg@10 - MaxSim_mrr@10 - MaxSim_map@100 model-index: - name: PyLate model based on jhu-clsp/ettin-encoder-17m results: - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoClimateFEVER type: NanoClimateFEVER metrics: - type: MaxSim_accuracy@1 value: 0.24 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.38 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.5 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.7 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.24 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.15333333333333332 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.11599999999999999 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.092 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.12833333333333333 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.21 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.2533333333333333 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.36666666666666664 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.28763494301317366 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.3579841269841269 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.22941676804604197 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoDBPedia type: NanoDBPedia metrics: - type: MaxSim_accuracy@1 value: 0.7 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.86 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.9 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.94 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.7 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.6133333333333334 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.556 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.48 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.07180505985706782 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.16504755248565225 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.2194302820279554 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.32972451635318606 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.5813676616013099 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.7881666666666667 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.4671775883716682 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoFEVER type: NanoFEVER metrics: - type: MaxSim_accuracy@1 value: 0.88 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.94 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.96 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.98 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.88 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.32666666666666666 name: Maxsim Precision@3 - type: 
MaxSim_precision@5 value: 0.20799999999999996 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.10799999999999997 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.8166666666666668 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.8933333333333333 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.93 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.96 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.9050308205730978 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.9106666666666665 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.8792272727272727 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoFiQA2018 type: NanoFiQA2018 metrics: - type: MaxSim_accuracy@1 value: 0.42 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.6 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.66 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.74 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.42 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.26666666666666666 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.212 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.132 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.2286904761904762 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.36584920634920637 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.4511190476190476 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.5564523809523809 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.4611987813833444 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.5268571428571428 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.38520628807075186 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoHotpotQA type: NanoHotpotQA metrics: - type: MaxSim_accuracy@1 value: 0.88 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 1.0 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 1.0 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 1.0 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.88 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.5533333333333332 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.344 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.17999999999999997 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.44 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.83 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.86 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.9 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.858075938741974 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.9333333333333332 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.800559405305322 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoMSMARCO type: NanoMSMARCO metrics: - type: MaxSim_accuracy@1 value: 0.52 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.66 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.7 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.8 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.52 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.22 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.14 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.08 name: Maxsim Precision@10 - type: 
MaxSim_recall@1 value: 0.52 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.66 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.7 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.8 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.6531074122045695 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.6071031746031745 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.6169218803486538 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoNFCorpus type: NanoNFCorpus metrics: - type: MaxSim_accuracy@1 value: 0.44 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.54 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.64 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.66 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.44 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.3466666666666666 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.332 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.26 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.04328552205779273 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.07677540422223056 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.1178452517343604 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.14031592190988035 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.32964183008765374 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.5071904761904762 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.14909457042487737 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoNQ type: NanoNQ metrics: - type: MaxSim_accuracy@1 value: 0.52 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.78 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.82 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.86 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.52 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.26 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.16799999999999998 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.09 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.49 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.72 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.77 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.81 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.6661035501816893 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.6417460317460317 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.6127876420077506 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoQuoraRetrieval type: NanoQuoraRetrieval metrics: - type: MaxSim_accuracy@1 value: 0.84 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.98 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 1.0 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 1.0 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.84 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.38666666666666655 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.244 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.12399999999999999 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.7440000000000001 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.9286666666666668 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 
0.9593333333333334 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.9626666666666668 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.9075252606458076 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.9106666666666667 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.8834929814655215 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoSCIDOCS type: NanoSCIDOCS metrics: - type: MaxSim_accuracy@1 value: 0.4 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.66 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.68 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.78 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.4 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.31333333333333335 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.244 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.162 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.08366666666666667 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.19366666666666668 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.24966666666666662 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.3306666666666666 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.3310802884457278 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.5407222222222222 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.25548127218322214 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoArguAna type: NanoArguAna metrics: - type: MaxSim_accuracy@1 value: 0.14 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.56 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.64 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.76 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.14 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.18666666666666668 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.128 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.07600000000000001 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.14 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.56 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.64 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.76 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.457648969568352 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.3595714285714285 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.36839121630206756 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoSciFact type: NanoSciFact metrics: - type: MaxSim_accuracy@1 value: 0.62 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.8 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.86 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.88 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.62 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.27999999999999997 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.18799999999999997 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.09799999999999999 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.595 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.77 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.845 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.87 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 
0.7463564404006293 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.7106666666666667 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.7060596028154851 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoTouche2020 type: NanoTouche2020 metrics: - type: MaxSim_accuracy@1 value: 0.7551020408163265 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.9795918367346939 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 1.0 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 1.0 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.7551020408163265 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.6666666666666666 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.6285714285714286 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.5102040816326531 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.0512448639546046 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.13492382759077773 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.21028780790933668 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.3208107900431349 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.5860354215626813 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.8656462585034013 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.42050847387780316 name: Maxsim Map@100 - task: type: nano-beir name: Nano BEIR dataset: name: NanoBEIR mean type: NanoBEIR_mean metrics: - type: MaxSim_accuracy@1 value: 0.5657770800627943 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.749199372056515 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.7969230769230768 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.8538461538461538 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.5657770800627943 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.3517948717948718 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.2698901098901099 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.18401569858712713 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.3348225068251237 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.5006355890241948 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.5543089017403102 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.6236387391737371 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.5977544091084623 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.666178527821385 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.5211019201497261 name: Maxsim Map@100 --- # PyLate model based on jhu-clsp/ettin-encoder-17m This is a [PyLate](https://github.com/lightonai/pylate) model finetuned from [jhu-clsp/ettin-encoder-17m](https://huggingface.co/jhu-clsp/ettin-encoder-17m) on the [train](https://huggingface.co/datasets/Speedsy/cleaned-ms-marco-bge-gemma-from-ligton) dataset. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator. 
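
Concretely, the model encodes a text into one 128-dimensional vector per token, and two texts are compared with the MaxSim (late-interaction) operator: each query token is matched against its most similar document token and the per-token maxima are summed. Below is a minimal, illustrative sketch of that scoring rule, assuming L2-normalized token embeddings; PyLate applies this internally, so you do not need this code to use the model.

```python
import torch


def maxsim_score(query_embeddings: torch.Tensor, document_embeddings: torch.Tensor) -> torch.Tensor:
    """MaxSim late-interaction score between one query and one document.

    query_embeddings:    (num_query_tokens, 128) L2-normalized token vectors
    document_embeddings: (num_document_tokens, 128) L2-normalized token vectors
    """
    # Cosine similarity between every query token and every document token ...
    similarities = query_embeddings @ document_embeddings.T
    # ... keep the best-matching document token per query token, then sum over query tokens.
    return similarities.max(dim=1).values.sum()
```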
## Model Details

### Model Description
- **Model Type:** PyLate model
- **Base model:** [jhu-clsp/ettin-encoder-17m](https://huggingface.co/jhu-clsp/ettin-encoder-17m)
- **Document Length:** 300 tokens
- **Query Length:** 32 tokens
- **Output Dimensionality:** 128 dimensions
- **Similarity Function:** MaxSim
- **Training Dataset:**
    - [train](https://huggingface.co/datasets/Speedsy/cleaned-ms-marco-bge-gemma-from-ligton)

### Model Sources

- **Documentation:** [PyLate Documentation](https://lightonai.github.io/pylate/)
- **Repository:** [PyLate on GitHub](https://github.com/lightonai/pylate)
- **Hugging Face:** [PyLate models on Hugging Face](https://huggingface.co/models?library=PyLate)

### Full Model Architecture

```
ColBERT(
  (0): Transformer({'max_seq_length': 299, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Dense({'in_features': 256, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
```

## Usage

First install the PyLate library:

```bash
pip install -U pylate
```

### Retrieval

PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. The index leverages the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.

#### Indexing documents

First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:

```python
from pylate import indexes, models, retrieve

# Step 1: Load the ColBERT model
model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

# Step 2: Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if any
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)
```

Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:

```python
# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)
```

#### Retrieving top-k documents for queries

Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries.
To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the top matches' ids and relevance scores:

```python
# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries
    show_progress_bar=True,
)

# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
```

### Reranking

If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the `rank` function and pass the queries and documents to rerank:

```python
from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
```
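
The reranked output contains, for each query, the candidate documents ordered by relevance. As a hedged illustration only (the exact return format should be checked against the PyLate documentation for your installed version), each entry is assumed to carry the document id and its MaxSim score:

```python
# Assumed structure: one list per query, each entry holding a document id and its score.
# Verify the exact keys against the PyLate documentation before relying on them.
for query, ranking in zip(queries, reranked_documents):
    print(query)
    for entry in ranking:
        print(f"  id={entry['id']}  score={entry['score']:.4f}")
```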
## Evaluation

### Metrics

#### Py Late Information Retrieval

* Dataset: `['NanoClimateFEVER', 'NanoDBPedia', 'NanoFEVER', 'NanoFiQA2018', 'NanoHotpotQA', 'NanoMSMARCO', 'NanoNFCorpus', 'NanoNQ', 'NanoQuoraRetrieval', 'NanoSCIDOCS', 'NanoArguAna', 'NanoSciFact', 'NanoTouche2020']`
* Evaluated with `pylate.evaluation.pylate_information_retrieval_evaluator.PyLateInformationRetrievalEvaluator`

| Metric              | NanoClimateFEVER | NanoDBPedia | NanoFEVER | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNFCorpus | NanoNQ     | NanoQuoraRetrieval | NanoSCIDOCS | NanoArguAna | NanoSciFact | NanoTouche2020 |
|:--------------------|:-----------------|:------------|:----------|:-------------|:-------------|:------------|:-------------|:-----------|:-------------------|:------------|:------------|:------------|:---------------|
| MaxSim_accuracy@1   | 0.24             | 0.7         | 0.88      | 0.42         | 0.88         | 0.52        | 0.44         | 0.52       | 0.84               | 0.4         | 0.14        | 0.62        | 0.7551         |
| MaxSim_accuracy@3   | 0.38             | 0.86        | 0.94      | 0.6          | 1.0          | 0.66        | 0.54         | 0.78       | 0.98               | 0.66        | 0.56        | 0.8         | 0.9796         |
| MaxSim_accuracy@5   | 0.5              | 0.9         | 0.96      | 0.66         | 1.0          | 0.7         | 0.64         | 0.82       | 1.0                | 0.68        | 0.64        | 0.86        | 1.0            |
| MaxSim_accuracy@10  | 0.7              | 0.94        | 0.98      | 0.74         | 1.0          | 0.8         | 0.66         | 0.86       | 1.0                | 0.78        | 0.76        | 0.88        | 1.0            |
| MaxSim_precision@1  | 0.24             | 0.7         | 0.88      | 0.42         | 0.88         | 0.52        | 0.44         | 0.52       | 0.84               | 0.4         | 0.14        | 0.62        | 0.7551         |
| MaxSim_precision@3  | 0.1533           | 0.6133      | 0.3267    | 0.2667       | 0.5533       | 0.22        | 0.3467       | 0.26       | 0.3867             | 0.3133      | 0.1867      | 0.28        | 0.6667         |
| MaxSim_precision@5  | 0.116            | 0.556       | 0.208     | 0.212        | 0.344        | 0.14        | 0.332        | 0.168      | 0.244              | 0.244       | 0.128       | 0.188       | 0.6286         |
| MaxSim_precision@10 | 0.092            | 0.48        | 0.108     | 0.132        | 0.18         | 0.08        | 0.26         | 0.09       | 0.124              | 0.162       | 0.076       | 0.098       | 0.5102         |
| MaxSim_recall@1     | 0.1283           | 0.0718      | 0.8167    | 0.2287       | 0.44         | 0.52        | 0.0433       | 0.49       | 0.744              | 0.0837      | 0.14        | 0.595       | 0.0512         |
| MaxSim_recall@3     | 0.21             | 0.165       | 0.8933    | 0.3658       | 0.83         | 0.66        | 0.0768       | 0.72       | 0.9287             | 0.1937      | 0.56        | 0.77        | 0.1349         |
| MaxSim_recall@5     | 0.2533           | 0.2194      | 0.93      | 0.4511       | 0.86         | 0.7         | 0.1178       | 0.77       | 0.9593             | 0.2497      | 0.64        | 0.845       | 0.2103         |
| MaxSim_recall@10    | 0.3667           | 0.3297      | 0.96      | 0.5565       | 0.9          | 0.8         | 0.1403       | 0.81       | 0.9627             | 0.3307      | 0.76        | 0.87        | 0.3208         |
| **MaxSim_ndcg@10**  | **0.2876**       | **0.5814**  | **0.905** | **0.4612**   | **0.8581**   | **0.6531**  | **0.3296**   | **0.6661** | **0.9075**         | **0.3311**  | **0.4576**  | **0.7464**  | **0.586**      |
| MaxSim_mrr@10       | 0.358            | 0.7882      | 0.9107    | 0.5269       | 0.9333       | 0.6071      | 0.5072       | 0.6417     | 0.9107             | 0.5407      | 0.3596      | 0.7107      | 0.8656         |
| MaxSim_map@100      | 0.2294           | 0.4672      | 0.8792    | 0.3852       | 0.8006       | 0.6169      | 0.1491       | 0.6128     | 0.8835             | 0.2555      | 0.3684      | 0.7061      | 0.4205         |

#### Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with `pylate.evaluation.nano_beir_evaluator.NanoBEIREvaluator`

| Metric              | Value      |
|:--------------------|:-----------|
| MaxSim_accuracy@1   | 0.5658     |
| MaxSim_accuracy@3   | 0.7492     |
| MaxSim_accuracy@5   | 0.7969     |
| MaxSim_accuracy@10  | 0.8538     |
| MaxSim_precision@1  | 0.5658     |
| MaxSim_precision@3  | 0.3518     |
| MaxSim_precision@5  | 0.2699     |
| MaxSim_precision@10 | 0.184      |
| MaxSim_recall@1     | 0.3348     |
| MaxSim_recall@3     | 0.5006     |
| MaxSim_recall@5     | 0.5543     |
| MaxSim_recall@10    | 0.6236     |
| **MaxSim_ndcg@10**  | **0.5978** |
| MaxSim_mrr@10       | 0.6662     |
| MaxSim_map@100      | 0.5211     |
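
The bolded headline metric in both tables is nDCG@10. For reference, here is a minimal, self-contained sketch of how nDCG@10 is computed for binary relevance labels; it is illustrative only, and the reported numbers above come from the PyLate evaluators named in the bullets.

```python
import math


def ndcg_at_10(ranked_relevances: list[int], total_relevant: int) -> float:
    """nDCG@10 for binary relevance: ranked_relevances[i] is 1 if the i-th
    retrieved document is relevant, 0 otherwise."""
    # Discounted cumulative gain over the top 10 retrieved documents.
    dcg = sum(rel / math.log2(rank + 2) for rank, rel in enumerate(ranked_relevances[:10]))
    # Ideal DCG: all relevant documents placed at the top of the ranking.
    ideal = sum(1 / math.log2(rank + 2) for rank in range(min(total_relevant, 10)))
    return dcg / ideal if ideal > 0 else 0.0


# Example: one relevant document retrieved at rank 3, out of 2 relevant documents overall.
print(ndcg_at_10([0, 0, 1, 0, 0, 0, 0, 0, 0, 0], total_relevant=2))
```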
## Training Details

### Training Dataset

#### train

* Dataset: [train](https://huggingface.co/datasets/Speedsy/cleaned-ms-marco-bge-gemma-from-ligton) at [63f51a3](https://huggingface.co/datasets/Speedsy/cleaned-ms-marco-bge-gemma-from-ligton/tree/63f51a32e72d981b44be6cace953edbb7dd2c4b0)
* Size: 533,177 training samples
* Columns: `query_id`, `document_ids`, and `scores`
* Approximate statistics based on the first 1000 samples:

  |      | query_id | document_ids | scores |
  |:-----|:---------|:-------------|:-------|
  | type | int      | list         | list   |
* Samples:
  | query_id | document_ids | scores |
  |:---------|:-------------|:-------|
  | 237784   | [6366584, 4034101, 2325374, 6914618, 6042146, ...] | [0.9999999991784339, 0.42233632827946693, 0.5956354295491569, 0.12644415907455164, 0.6636713730105909, ...] |
  | 904294   | [448408, 8743975, 49600, 7339401, 2714261, ...]    | [0.9999999991841937, 0.877629062381539, 0.8330146583389045, 0.3116634796692611, 0.4633524534142185, ...]    |
  | 412214   | [1006306, 4454048, 1949661, 4895656, 675880, ...]  | [0.9999999994734676, 0.38790621123137803, 0.3747429039573546, 0.2990538871317199, 0.38420403106055895, ...] |
* Loss: `pylate.losses.distillation.Distillation`
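
The Distillation loss trains the student ColBERT model to reproduce the teacher's relevance scores (the `scores` column above) for each query's candidate documents. As a purely illustrative sketch of that idea, not PyLate's actual implementation, the objective can be written as a KL divergence between the teacher and student score distributions over the candidates:

```python
import torch
import torch.nn.functional as F


def score_distillation_loss(student_scores: torch.Tensor, teacher_scores: torch.Tensor) -> torch.Tensor:
    """Illustrative score-distillation objective.

    student_scores: (batch, n_docs) MaxSim scores produced by the model being trained.
    teacher_scores: (batch, n_docs) relevance scores from the teacher model.
    """
    student_log_probs = F.log_softmax(student_scores, dim=-1)
    teacher_probs = F.softmax(teacher_scores, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
```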
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `learning_rate`: 3e-05
- `num_train_epochs`: 1
- `bf16`: True

#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 3e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>
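
For reference, a minimal sketch of how the non-default hyperparameters above map onto a training-arguments object. This assumes the standard sentence-transformers training arguments that PyLate builds on; the exact training script for this model is not included in the card, and the output directory name below is a placeholder.

```python
from sentence_transformers import SentenceTransformerTrainingArguments

# Sketch only: mirrors the non-default hyperparameters listed above.
args = SentenceTransformerTrainingArguments(
    output_dir="output/ettin-encoder-17m-colbert",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    learning_rate=3e-5,
    num_train_epochs=1,
    bf16=True,
)
```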
### Training Logs
Click to expand | Epoch | Step | Training Loss | NanoClimateFEVER_MaxSim_ndcg@10 | NanoDBPedia_MaxSim_ndcg@10 | NanoFEVER_MaxSim_ndcg@10 | NanoFiQA2018_MaxSim_ndcg@10 | NanoHotpotQA_MaxSim_ndcg@10 | NanoMSMARCO_MaxSim_ndcg@10 | NanoNFCorpus_MaxSim_ndcg@10 | NanoNQ_MaxSim_ndcg@10 | NanoQuoraRetrieval_MaxSim_ndcg@10 | NanoSCIDOCS_MaxSim_ndcg@10 | NanoArguAna_MaxSim_ndcg@10 | NanoSciFact_MaxSim_ndcg@10 | NanoTouche2020_MaxSim_ndcg@10 | NanoBEIR_mean_MaxSim_ndcg@10 | |:------:|:-----:|:-------------:|:-------------------------------:|:--------------------------:|:------------------------:|:---------------------------:|:---------------------------:|:--------------------------:|:---------------------------:|:---------------------:|:---------------------------------:|:--------------------------:|:--------------------------:|:--------------------------:|:-----------------------------:|:----------------------------:| | 0.0030 | 100 | 0.0383 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0060 | 200 | 0.0328 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0090 | 300 | 0.0319 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0120 | 400 | 0.0313 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0150 | 500 | 0.0294 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0180 | 600 | 0.0265 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0210 | 700 | 0.026 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0240 | 800 | 0.0251 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0270 | 900 | 0.0242 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0300 | 1000 | 0.0245 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0330 | 1100 | 0.0232 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0360 | 1200 | 0.0236 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0390 | 1300 | 0.0231 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0420 | 1400 | 0.0227 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0450 | 1500 | 0.0225 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0480 | 1600 | 0.0222 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0510 | 1700 | 0.0218 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0540 | 1800 | 0.022 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0570 | 1900 | 0.0213 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0600 | 2000 | 0.0214 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0630 | 2100 | 0.0214 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0660 | 2200 | 0.0209 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0690 | 2300 | 0.0204 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0720 | 2400 | 0.0201 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0750 | 2500 | 0.02 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0780 | 2600 | 0.0207 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0810 | 2700 | 0.0199 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0840 | 2800 | 0.0198 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0870 | 2900 | 0.0196 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0900 | 3000 | 0.0197 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0930 | 3100 | 0.0194 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.0960 | 3200 | 0.0193 | - | - | - | - | - | - | - | - | - | - | - | - 
| - | - |
| 0.0990 | 3300 | 0.019 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1020 | 3400 | 0.0187 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1050 | 3500 | 0.0187 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1080 | 3600 | 0.0187 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1110 | 3700 | 0.0183 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1140 | 3800 | 0.0185 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1170 | 3900 | 0.0186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1200 | 4000 | 0.0188 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1230 | 4100 | 0.019 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1260 | 4200 | 0.018 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1290 | 4300 | 0.0183 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1320 | 4400 | 0.0178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1350 | 4500 | 0.018 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1380 | 4600 | 0.0175 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1410 | 4700 | 0.0172 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1440 | 4800 | 0.0174 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1470 | 4900 | 0.0176 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1500 | 5000 | 0.017 | 0.2687 | 0.5353 | 0.8792 | 0.4640 | 0.8518 | 0.6460 | 0.3198 | 0.6059 | 0.9229 | 0.3192 | 0.3772 | 0.7166 | 0.5900 | 0.5767 |
| 0.1530 | 5100 | 0.0173 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1560 | 5200 | 0.0175 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1590 | 5300 | 0.0173 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1620 | 5400 | 0.0168 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1650 | 5500 | 0.0164 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1680 | 5600 | 0.0171 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1710 | 5700 | 0.0169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1740 | 5800 | 0.0166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1770 | 5900 | 0.0169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1801 | 6000 | 0.0166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1831 | 6100 | 0.0164 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1861 | 6200 | 0.0165 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1891 | 6300 | 0.0162 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1921 | 6400 | 0.0161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1951 | 6500 | 0.0163 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1981 | 6600 | 0.0161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2011 | 6700 | 0.0161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2041 | 6800 | 0.0159 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2071 | 6900 | 0.0166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2101 | 7000 | 0.0158 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2131 | 7100 | 0.0161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2161 | 7200 | 0.0154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2191 | 7300 | 0.0151 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2221 | 7400 | 0.0155 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2251 | 7500 | 0.0154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2281 | 7600 | 0.0157 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2311 | 7700 | 0.0154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2341 | 7800 | 0.0153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2371 | 7900 | 0.0155 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2401 | 8000 | 0.015 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2431 | 8100 | 0.0155 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2461 | 8200 | 0.0153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2491 | 8300 | 0.015 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2521 | 8400 | 0.0149 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2551 | 8500 | 0.0151 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2581 | 8600 | 0.015 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2611 | 8700 | 0.0151 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2641 | 8800 | 0.0152 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2671 | 8900 | 0.0152 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2701 | 9000 | 0.0149 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2731 | 9100 | 0.0145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2761 | 9200 | 0.0145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2791 | 9300 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2821 | 9400 | 0.0151 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2851 | 9500 | 0.0144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2881 | 9600 | 0.0144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2911 | 9700 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2941 | 9800 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2971 | 9900 | 0.0146 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3001 | 10000 | 0.0144 | 0.2775 | 0.5458 | 0.8872 | 0.4447 | 0.8686 | 0.6438 | 0.3179 | 0.6183 | 0.9151 | 0.3025 | 0.4000 | 0.7288 | 0.5733 | 0.5787 |
| 0.3031 | 10100 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3061 | 10200 | 0.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3091 | 10300 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3121 | 10400 | 0.0146 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3151 | 10500 | 0.0143 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3181 | 10600 | 0.0144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3211 | 10700 | 0.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3241 | 10800 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3271 | 10900 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3301 | 11000 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3331 | 11100 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3361 | 11200 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3391 | 11300 | 0.0143 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3421 | 11400 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3451 | 11500 | 0.0144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3481 | 11600 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3511 | 11700 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3541 | 11800 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3571 | 11900 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3601 | 12000 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3631 | 12100 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3661 | 12200 | 0.0144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3691 | 12300 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3721 | 12400 | 0.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3751 | 12500 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3781 | 12600 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3811 | 12700 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3841 | 12800 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3871 | 12900 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3901 | 13000 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3931 | 13100 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3961 | 13200 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3991 | 13300 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4021 | 13400 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4051 | 13500 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4081 | 13600 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4111 | 13700 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4141 | 13800 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4171 | 13900 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4201 | 14000 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4231 | 14100 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4261 | 14200 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4291 | 14300 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4321 | 14400 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4351 | 14500 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4381 | 14600 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4411 | 14700 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4441 | 14800 | 0.0134 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4471 | 14900 | 0.0133 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4501 | 15000 | 0.0136 | 0.2846 | 0.5781 | 0.8899 | 0.4652 | 0.8629 | 0.6491 | 0.3255 | 0.6303 | 0.9427 | 0.3168 | 0.4224 | 0.7354 | 0.5816 | 0.5911 |
| 0.4531 | 15100 | 0.0133 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4561 | 15200 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4591 | 15300 | 0.0133 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4621 | 15400 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4651 | 15500 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4681 | 15600 | 0.0133 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4711 | 15700 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4741 | 15800 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4771 | 15900 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4801 | 16000 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4831 | 16100 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4861 | 16200 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4891 | 16300 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4921 | 16400 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4951 | 16500 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4981 | 16600 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5011 | 16700 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5041 | 16800 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5071 | 16900 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5101 | 17000 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5131 | 17100 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5161 | 17200 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5191 | 17300 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5221 | 17400 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5251 | 17500 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5281 | 17600 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5311 | 17700 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5341 | 17800 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5372 | 17900 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5402 | 18000 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5432 | 18100 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5462 | 18200 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5492 | 18300 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5522 | 18400 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5552 | 18500 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5582 | 18600 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5612 | 18700 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5642 | 18800 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5672 | 18900 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5702 | 19000 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5732 | 19100 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5762 | 19200 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5792 | 19300 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5822 | 19400 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5852 | 19500 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5882 | 19600 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5912 | 19700 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5942 | 19800 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5972 | 19900 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6002 | 20000 | 0.0124 | 0.2803 | 0.5772 | 0.8907 | 0.4668 | 0.8676 | 0.6476 | 0.3364 | 0.6633 | 0.9129 | 0.3240 | 0.4412 | 0.7460 | 0.5781 | 0.5948 |
| 0.6032 | 20100 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6062 | 20200 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6092 | 20300 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6122 | 20400 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6152 | 20500 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6182 | 20600 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6212 | 20700 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6242 | 20800 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6272 | 20900 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6302 | 21000 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6332 | 21100 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6362 | 21200 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6392 | 21300 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6422 | 21400 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6452 | 21500 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6482 | 21600 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6512 | 21700 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6542 | 21800 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6572 | 21900 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6602 | 22000 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6632 | 22100 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6662 | 22200 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6692 | 22300 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6722 | 22400 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6752 | 22500 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6782 | 22600 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6812 | 22700 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6842 | 22800 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6872 | 22900 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6902 | 23000 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6932 | 23100 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6962 | 23200 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6992 | 23300 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7022 | 23400 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7052 | 23500 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7082 | 23600 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7112 | 23700 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7142 | 23800 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7172 | 23900 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7202 | 24000 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7232 | 24100 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7262 | 24200 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7292 | 24300 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7322 | 24400 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7352 | 24500 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7382 | 24600 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7412 | 24700 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7442 | 24800 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7472 | 24900 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7502 | 25000 | 0.012 | 0.2905 | 0.5880 | 0.9048 | 0.4640 | 0.8575 | 0.6645 | 0.3255 | 0.6545 | 0.9043 | 0.3302 | 0.4645 | 0.7485 | 0.5655 | 0.5971 |
| 0.7532 | 25100 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7562 | 25200 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7592 | 25300 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7622 | 25400 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7652 | 25500 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7682 | 25600 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7712 | 25700 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7742 | 25800 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7772 | 25900 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7802 | 26000 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7832 | 26100 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7862 | 26200 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7892 | 26300 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7922 | 26400 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7952 | 26500 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7982 | 26600 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8012 | 26700 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8042 | 26800 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8072 | 26900 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8102 | 27000 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8132 | 27100 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8162 | 27200 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8192 | 27300 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8222 | 27400 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8252 | 27500 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8282 | 27600 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8312 | 27700 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8342 | 27800 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8372 | 27900 | 0.0112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8402 | 28000 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8432 | 28100 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8462 | 28200 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8492 | 28300 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8522 | 28400 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8552 | 28500 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8582 | 28600 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8612 | 28700 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8642 | 28800 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8672 | 28900 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8702 | 29000 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8732 | 29100 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8762 | 29200 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8792 | 29300 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8822 | 29400 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8852 | 29500 | 0.0112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8882 | 29600 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8912 | 29700 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8943 | 29800 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8973 | 29900 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9003 | 30000 | 0.0113 | 0.2876 | 0.5814 | 0.9050 | 0.4612 | 0.8581 | 0.6531 | 0.3296 | 0.6661 | 0.9075 | 0.3311 | 0.4576 | 0.7464 | 0.5860 | 0.5978 |
| 0.9033 | 30100 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9063 | 30200 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9093 | 30300 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9123 | 30400 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9153 | 30500 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9183 | 30600 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9213 | 30700 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9243 | 30800 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9273 | 30900 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9303 | 31000 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9333 | 31100 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9363 | 31200 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9393 | 31300 | 0.0112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9423 | 31400 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9453 | 31500 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9483 | 31600 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9513 | 31700 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9543 | 31800 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9573 | 31900 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9603 | 32000 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9633 | 32100 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9663 | 32200 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9693 | 32300 | 0.0111 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9723 | 32400 | 0.0111 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9753 | 32500 | 0.0111 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9783 | 32600 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9813 | 32700 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9843 | 32800 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9873 | 32900 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9903 | 33000 | 0.0111 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9933 | 33100 | 0.0112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9963 | 33200 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9993 | 33300 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
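
The checkpoints logged above can be reloaded with PyLate for MaxSim scoring. The snippet below is a minimal, illustrative sketch, not the canonical usage for this release: the checkpoint path and document IDs are placeholders, and it relies on the standard `pylate.models.ColBERT` and `pylate.rank.rerank` helpers.

```python
from pylate import models, rank

# Placeholder path: point this at the exported checkpoint directory for this run.
model = models.ColBERT(model_name_or_path="path/to/final-checkpoint")

queries = ["what does late interaction mean in retrieval?"]
documents = [["ColBERT-style models keep one embedding per token and score with MaxSim."]]

# Encode queries and documents into per-token embedding matrices.
queries_embeddings = model.encode(queries, is_query=True)
documents_embeddings = model.encode(documents, is_query=False)

# Compute MaxSim relevance scores between each query and its candidate documents.
reranked = rank.rerank(
    documents_ids=[["doc-0"]],  # placeholder IDs, one list per query
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
print(reranked)
```
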
### Framework Versions
- Python: 3.11.13
- Sentence Transformers: 4.0.2
- PyLate: 1.2.0
- Transformers: 4.48.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.9.0
- Datasets: 4.0.0
- Tokenizers: 0.21.4

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084"
}
```

#### PyLate

```bibtex
@misc{PyLate,
    title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
    author={Chaffin, Antoine and Sourty, Raphaël},
    url={https://github.com/lightonai/pylate},
    year={2024}
}
```