SentenceTransformer based on Alibaba-NLP/gte-Qwen2-1.5B-instruct

This is a sentence-transformers model finetuned from Alibaba-NLP/gte-Qwen2-1.5B-instruct. It maps sentences & paragraphs to a 1536-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Alibaba-NLP/gte-Qwen2-1.5B-instruct
  • Maximum Sequence Length: 32768 tokens
  • Output Dimensionality: 1536 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 32768, 'do_lower_case': False, 'architecture': 'PeftModelForFeatureExtraction'})
  (1): Pooling({'word_embedding_dimension': 1536, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': True, 'include_prompt': True})
  (2): Normalize()
)
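
The Pooling module keeps only the hidden state of the last token (pooling_mode_lasttoken: True), and the Normalize module L2-normalizes the result. A minimal PyTorch sketch of those two steps, assuming a right-padded batch with an attention mask (the function name is illustrative, not part of this repository):

import torch

def last_token_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 1536); attention_mask: (batch, seq_len) of 1s and 0s
    # Index of the last non-padding token for each sequence (assumes right padding).
    last_idx = attention_mask.sum(dim=1) - 1
    batch_idx = torch.arange(token_embeddings.size(0), device=token_embeddings.device)
    pooled = token_embeddings[batch_idx, last_idx]            # (batch, 1536)
    # Normalize() step: unit-length vectors, so dot product equals cosine similarity.
    return torch.nn.functional.normalize(pooled, p=2, dim=1)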

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("praphul555/ohai_gte_qwen_1.5b_instruct")
# Run inference
queries = [
    "<user> Hi, I need to cancel a test.\n<assistant> Of course, which test needs cancellation?\n<user> The reflex lactic acid that we ordered.\n<assistant> Got it, I'll cancel it.\n<user> Perfect, cancel it now.",
]
documents = [
    'Reflex Lactic Acid w/ Reflex, Plasma',
    'pt may return to room in 30 min if vital signs stable',
    'BH Social Services Assessment',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 1536] [3, 1536]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.7656,  0.0259, -0.0052]], dtype=torch.bfloat16)
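
To turn the similarity matrix into a ranking, sort each row by score. A small follow-up sketch using the variables defined above:

# Rank documents for each query by descending cosine similarity.
ranking = similarities.argsort(dim=1, descending=True)
for q in range(len(queries)):
    top = int(ranking[q, 0])
    print(f"Best match: {documents[top]!r} (score: {similarities[q, top].item():.4f})")
# For the example above, the best match is 'Reflex Lactic Acid w/ Reflex, Plasma'.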

Evaluation

Metrics

Information Retrieval

Metric Value
cosine_accuracy@1 0.7729
cosine_accuracy@3 0.9492
cosine_accuracy@5 0.9671
cosine_accuracy@10 0.9777
cosine_precision@1 0.7729
cosine_precision@3 0.3164
cosine_precision@5 0.1934
cosine_precision@10 0.0978
cosine_recall@1 0.7729
cosine_recall@3 0.9492
cosine_recall@5 0.9671
cosine_recall@10 0.9777
cosine_ndcg@10 0.8903
cosine_mrr@10 0.8606
cosine_map@100 0.861
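
Accuracy@k and recall@k coincide above, which is consistent with a single relevant document per query. A minimal sketch of how such retrieval metrics can be computed from ranked results (plain Python, names illustrative):

def accuracy_at_k(ranked_ids, relevant_id_per_query, k):
    # Fraction of queries whose relevant document appears in the top-k results.
    hits = sum(rel in ranked[:k] for ranked, rel in zip(ranked_ids, relevant_id_per_query))
    return hits / len(relevant_id_per_query)

def mrr_at_k(ranked_ids, relevant_id_per_query, k=10):
    # Mean reciprocal rank of the first hit within the top-k results.
    total = 0.0
    for ranked, rel in zip(ranked_ids, relevant_id_per_query):
        if rel in ranked[:k]:
            total += 1.0 / (ranked[:k].index(rel) + 1)
    return total / len(relevant_id_per_query)

# Example: one query whose relevant document is ranked second.
print(accuracy_at_k([["d2", "d1", "d3"]], ["d1"], k=3))  # 1.0
print(mrr_at_k([["d2", "d1", "d3"]], ["d1"]))            # 0.5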

Training Details

Training Dataset

Unnamed Dataset

  • Size: 204,376 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:
    • sentence_0: string; min: 2 tokens, mean: 20.11 tokens, max: 71 tokens
    • sentence_1: string; min: 2 tokens, mean: 13.66 tokens, max: 59 tokens
  • Samples (sentence_0 → sentence_1):
    • "Please renew the calcium gluconate tabs." → "calcium gluconate (calcium gluconate 500 mg oral tablet)"
    • "r" → "Measured Weight"
    • "Order celiac antibodies tTG IgA and total IgA w/ reflex test." → "Celiac Antibodies tTG IgA + Total IgA w/Rflx to tTG IgG and DGP IgG"
  • Loss: main.LoggingMNR with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
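
main.LoggingMNR appears to be a logging wrapper around MultipleNegativesRankingLoss (MNRL); the parameters above (scale 20.0, cosine similarity) match MNRL's defaults. A hedged sketch of how such a pair dataset and loss are typically wired up in Sentence Transformers, using the stock loss since the wrapper itself is not published here:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-1.5B-instruct")

# (sentence_0, sentence_1) positive pairs; the other examples in a batch act as negatives.
train_dataset = Dataset.from_dict({
    "sentence_0": ["Please renew the calcium gluconate tabs."],
    "sentence_1": ["calcium gluconate (calcium gluconate 500 mg oral tablet)"],
})

loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)
trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()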
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • fp16: True
  • multi_dataset_batch_sampler: round_robin
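
These values map directly onto SentenceTransformerTrainingArguments; a minimal sketch (the output directory is illustrative):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/ohai_gte_qwen_1.5b_instruct",  # illustrative path
    eval_strategy="steps",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    fp16=True,
    num_train_epochs=3,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)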

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss ir-val_cosine_ndcg@10
0.0063 10 - 0.7871
0.0125 20 - 0.7863
0.0188 30 - 0.7846
0.0250 40 - 0.7848
0.0313 50 - 0.7875
0.0376 60 - 0.7887
0.0438 70 - 0.7912
0.0501 80 - 0.7922
0.0564 90 - 0.7989
0.0626 100 - 0.8035
0.0689 110 - 0.8112
0.0751 120 - 0.8138
0.0814 130 - 0.8232
0.0877 140 - 0.8265
0.0939 150 - 0.8310
0.1002 160 - 0.8340
0.1064 170 - 0.8402
0.1127 180 - 0.8442
0.1190 190 - 0.8453
0.1252 200 - 0.8469
0.1315 210 - 0.8498
0.1378 220 - 0.8500
0.1440 230 - 0.8502
0.1503 240 - 0.8537
0.1565 250 - 0.8544
0.1628 260 - 0.8544
0.1691 270 - 0.8562
0.1753 280 - 0.8560
0.1816 290 - 0.8572
0.1879 300 - 0.8597
0.1941 310 - 0.8617
0.2004 320 - 0.8621
0.2066 330 - 0.8624
0.2129 340 - 0.8638
0.2192 350 - 0.8631
0.2254 360 - 0.8628
0.2317 370 - 0.8632
0.2379 380 - 0.8678
0.2442 390 - 0.8653
0.2505 400 - 0.8655
0.2567 410 - 0.8671
0.2630 420 - 0.8668
0.2693 430 - 0.8674
0.2755 440 - 0.8675
0.2818 450 - 0.8678
0.2880 460 - 0.8675
0.2943 470 - 0.8679
0.3006 480 - 0.8685
0.3068 490 - 0.8674
0.3131 500 0.3129 0.8686
0.3193 510 - 0.8698
0.3256 520 - 0.8695
0.3319 530 - 0.8693
0.3381 540 - 0.8713
0.3444 550 - 0.8707
0.3507 560 - 0.8713
0.3569 570 - 0.8692
0.3632 580 - 0.8716
0.3694 590 - 0.8728
0.3757 600 - 0.8716
0.3820 610 - 0.8747
0.3882 620 - 0.8730
0.3945 630 - 0.8736
0.4008 640 - 0.8744
0.4070 650 - 0.8746
0.4133 660 - 0.8751
0.4195 670 - 0.8746
0.4258 680 - 0.8727
0.4321 690 - 0.8735
0.4383 700 - 0.8737
0.4446 710 - 0.8726
0.4508 720 - 0.8714
0.4571 730 - 0.8735
0.4634 740 - 0.8735
0.4696 750 - 0.8719
0.4759 760 - 0.8721
0.4822 770 - 0.8734
0.4884 780 - 0.8729
0.4947 790 - 0.8732
0.5009 800 - 0.8739
0.5072 810 - 0.8731
0.5135 820 - 0.8740
0.5197 830 - 0.8723
0.5260 840 - 0.8715
0.5322 850 - 0.8742
0.5385 860 - 0.8738
0.5448 870 - 0.8742
0.5510 880 - 0.8727
0.5573 890 - 0.8718
0.5636 900 - 0.8735
0.5698 910 - 0.8747
0.5761 920 - 0.8743
0.5823 930 - 0.8725
0.5886 940 - 0.8741
0.5949 950 - 0.8726
0.6011 960 - 0.8724
0.6074 970 - 0.8740
0.6137 980 - 0.8751
0.6199 990 - 0.8751
0.6262 1000 0.1567 0.8760
0.6324 1010 - 0.8745
0.6387 1020 - 0.8729
0.6450 1030 - 0.8737
0.6512 1040 - 0.8778
0.6575 1050 - 0.8771
0.6637 1060 - 0.8765
0.6700 1070 - 0.8787
0.6763 1080 - 0.8780
0.6825 1090 - 0.8775
0.6888 1100 - 0.8761
0.6951 1110 - 0.8762
0.7013 1120 - 0.8770
0.7076 1130 - 0.8768
0.7138 1140 - 0.8778
0.7201 1150 - 0.8770
0.7264 1160 - 0.8777
0.7326 1170 - 0.8792
0.7389 1180 - 0.8796
0.7451 1190 - 0.8793
0.7514 1200 - 0.8795
0.7577 1210 - 0.8794
0.7639 1220 - 0.8770
0.7702 1230 - 0.8771
0.7765 1240 - 0.8770
0.7827 1250 - 0.8766
0.7890 1260 - 0.8772
0.7952 1270 - 0.8780
0.8015 1280 - 0.8796
0.8078 1290 - 0.8787
0.8140 1300 - 0.8793
0.8203 1310 - 0.8784
0.8265 1320 - 0.8794
0.8328 1330 - 0.8774
0.8391 1340 - 0.8805
0.8453 1350 - 0.8807
0.8516 1360 - 0.8793
0.8579 1370 - 0.8805
0.8641 1380 - 0.8792
0.8704 1390 - 0.8799
0.8766 1400 - 0.8789
0.8829 1410 - 0.8789
0.8892 1420 - 0.8805
0.8954 1430 - 0.8792
0.9017 1440 - 0.8822
0.9080 1450 - 0.8797
0.9142 1460 - 0.8793
0.9205 1470 - 0.8796
0.9267 1480 - 0.8791
0.9330 1490 - 0.8802
0.9393 1500 0.147 0.8804
0.9455 1510 - 0.8806
0.9518 1520 - 0.8786
0.9580 1530 - 0.8794
0.9643 1540 - 0.8799
0.9706 1550 - 0.8812
0.9768 1560 - 0.8802
0.9831 1570 - 0.8808
0.9894 1580 - 0.8808
0.9956 1590 - 0.8805
1.0 1597 - 0.8805
1.0019 1600 - 0.8806
1.0081 1610 - 0.8806
1.0144 1620 - 0.8804
1.0207 1630 - 0.8791
1.0269 1640 - 0.8804
1.0332 1650 - 0.8802
1.0394 1660 - 0.8823
1.0457 1670 - 0.8805
1.0520 1680 - 0.8808
1.0582 1690 - 0.8824
1.0645 1700 - 0.8801
1.0708 1710 - 0.8810
1.0770 1720 - 0.8812
1.0833 1730 - 0.8817
1.0895 1740 - 0.8812
1.0958 1750 - 0.8806
1.1021 1760 - 0.8820
1.1083 1770 - 0.8826
1.1146 1780 - 0.8825
1.1209 1790 - 0.8800
1.1271 1800 - 0.8805
1.1334 1810 - 0.8801
1.1396 1820 - 0.8819
1.1459 1830 - 0.8810
1.1522 1840 - 0.8800
1.1584 1850 - 0.8813
1.1647 1860 - 0.8817
1.1709 1870 - 0.8801
1.1772 1880 - 0.8809
1.1835 1890 - 0.8821
1.1897 1900 - 0.8838
1.1960 1910 - 0.8819
1.2023 1920 - 0.8821
1.2085 1930 - 0.8826
1.2148 1940 - 0.8841
1.2210 1950 - 0.8845
1.2273 1960 - 0.8829
1.2336 1970 - 0.8817
1.2398 1980 - 0.8833
1.2461 1990 - 0.8861
1.2523 2000 0.1343 0.8825
1.2586 2010 - 0.8821
1.2649 2020 - 0.8840
1.2711 2030 - 0.8836
1.2774 2040 - 0.8832
1.2837 2050 - 0.8806
1.2899 2060 - 0.8820
1.2962 2070 - 0.8805
1.3024 2080 - 0.8818
1.3087 2090 - 0.8834
1.3150 2100 - 0.8819
1.3212 2110 - 0.8854
1.3275 2120 - 0.8824
1.3338 2130 - 0.8811
1.3400 2140 - 0.8823
1.3463 2150 - 0.8810
1.3525 2160 - 0.8819
1.3588 2170 - 0.8816
1.3651 2180 - 0.8828
1.3713 2190 - 0.8828
1.3776 2200 - 0.8850
1.3838 2210 - 0.8833
1.3901 2220 - 0.8849
1.3964 2230 - 0.8834
1.4026 2240 - 0.8815
1.4089 2250 - 0.8821
1.4152 2260 - 0.8830
1.4214 2270 - 0.8822
1.4277 2280 - 0.8809
1.4339 2290 - 0.8831
1.4402 2300 - 0.8838
1.4465 2310 - 0.8840
1.4527 2320 - 0.8836
1.4590 2330 - 0.8827
1.4652 2340 - 0.8833
1.4715 2350 - 0.8836
1.4778 2360 - 0.8823
1.4840 2370 - 0.8823
1.4903 2380 - 0.8829
1.4966 2390 - 0.8823
1.5028 2400 - 0.8826
1.5091 2410 - 0.8839
1.5153 2420 - 0.8833
1.5216 2430 - 0.8830
1.5279 2440 - 0.8829
1.5341 2450 - 0.8828
1.5404 2460 - 0.8849
1.5466 2470 - 0.8827
1.5529 2480 - 0.8833
1.5592 2490 - 0.8832
1.5654 2500 0.1315 0.8841
1.5717 2510 - 0.8835
1.5780 2520 - 0.8839
1.5842 2530 - 0.8834
1.5905 2540 - 0.8847
1.5967 2550 - 0.8829
1.6030 2560 - 0.8815
1.6093 2570 - 0.8815
1.6155 2580 - 0.8815
1.6218 2590 - 0.8828
1.6281 2600 - 0.8839
1.6343 2610 - 0.8831
1.6406 2620 - 0.8848
1.6468 2630 - 0.8840
1.6531 2640 - 0.8821
1.6594 2650 - 0.8849
1.6656 2660 - 0.8833
1.6719 2670 - 0.8824
1.6781 2680 - 0.8826
1.6844 2690 - 0.8819
1.6907 2700 - 0.8831
1.6969 2710 - 0.8831
1.7032 2720 - 0.8845
1.7095 2730 - 0.8820
1.7157 2740 - 0.8814
1.7220 2750 - 0.8813
1.7282 2760 - 0.8830
1.7345 2770 - 0.8838
1.7408 2780 - 0.8833
1.7470 2790 - 0.8825
1.7533 2800 - 0.8814
1.7595 2810 - 0.8821
1.7658 2820 - 0.8817
1.7721 2830 - 0.8829
1.7783 2840 - 0.8837
1.7846 2850 - 0.8840
1.7909 2860 - 0.8838
1.7971 2870 - 0.8842
1.8034 2880 - 0.8867
1.8096 2890 - 0.8865
1.8159 2900 - 0.8863
1.8222 2910 - 0.8857
1.8284 2920 - 0.8846
1.8347 2930 - 0.8842
1.8410 2940 - 0.8860
1.8472 2950 - 0.8857
1.8535 2960 - 0.8851
1.8597 2970 - 0.8852
1.8660 2980 - 0.8852
1.8723 2990 - 0.8862
1.8785 3000 0.1249 0.8850
1.8848 3010 - 0.8843
1.8910 3020 - 0.8845
1.8973 3030 - 0.8862
1.9036 3040 - 0.8862
1.9098 3050 - 0.8848
1.9161 3060 - 0.8847
1.9224 3070 - 0.8865
1.9286 3080 - 0.8857
1.9349 3090 - 0.8874
1.9411 3100 - 0.8855
1.9474 3110 - 0.8873
1.9537 3120 - 0.8872
1.9599 3130 - 0.8856
1.9662 3140 - 0.8857
1.9724 3150 - 0.8862
1.9787 3160 - 0.8861
1.9850 3170 - 0.8861
1.9912 3180 - 0.8872
1.9975 3190 - 0.8869
2.0 3194 - 0.8850
2.0038 3200 - 0.8865
2.0100 3210 - 0.8854
2.0163 3220 - 0.8865
2.0225 3230 - 0.8847
2.0288 3240 - 0.8860
2.0351 3250 - 0.8883
2.0413 3260 - 0.8868
2.0476 3270 - 0.8842
2.0539 3280 - 0.8829
2.0601 3290 - 0.8830
2.0664 3300 - 0.8847
2.0726 3310 - 0.8840
2.0789 3320 - 0.8866
2.0852 3330 - 0.8845
2.0914 3340 - 0.8852
2.0977 3350 - 0.8864
2.1039 3360 - 0.8873
2.1102 3370 - 0.8877
2.1165 3380 - 0.8861
2.1227 3390 - 0.8865
2.1290 3400 - 0.8857
2.1353 3410 - 0.8857
2.1415 3420 - 0.8873
2.1478 3430 - 0.8866
2.1540 3440 - 0.8851
2.1603 3450 - 0.8865
2.1666 3460 - 0.8850
2.1728 3470 - 0.8837
2.1791 3480 - 0.8869
2.1853 3490 - 0.8861
2.1916 3500 0.1171 0.8861
2.1979 3510 - 0.8862
2.2041 3520 - 0.8889
2.2104 3530 - 0.8851
2.2167 3540 - 0.8878
2.2229 3550 - 0.8868
2.2292 3560 - 0.8858
2.2354 3570 - 0.8859
2.2417 3580 - 0.8844
2.2480 3590 - 0.8876
2.2542 3600 - 0.8881
2.2605 3610 - 0.8872
2.2668 3620 - 0.8846
2.2730 3630 - 0.8848
2.2793 3640 - 0.8846
2.2855 3650 - 0.8860
2.2918 3660 - 0.8864
2.2981 3670 - 0.8867
2.3043 3680 - 0.8860
2.3106 3690 - 0.8884
2.3168 3700 - 0.8881
2.3231 3710 - 0.8869
2.3294 3720 - 0.8869
2.3356 3730 - 0.8857
2.3419 3740 - 0.8866
2.3482 3750 - 0.8860
2.3544 3760 - 0.8872
2.3607 3770 - 0.8877
2.3669 3780 - 0.8887
2.3732 3790 - 0.8875
2.3795 3800 - 0.8883
2.3857 3810 - 0.8875
2.3920 3820 - 0.8879
2.3982 3830 - 0.8853
2.4045 3840 - 0.8877
2.4108 3850 - 0.8867
2.4170 3860 - 0.8879
2.4233 3870 - 0.8883
2.4296 3880 - 0.8897
2.4358 3890 - 0.8898
2.4421 3900 - 0.8865
2.4483 3910 - 0.8867
2.4546 3920 - 0.8866
2.4609 3930 - 0.8868
2.4671 3940 - 0.8866
2.4734 3950 - 0.8854
2.4796 3960 - 0.8884
2.4859 3970 - 0.8855
2.4922 3980 - 0.8858
2.4984 3990 - 0.8854
2.5047 4000 0.117 0.8861
2.5110 4010 - 0.8865
2.5172 4020 - 0.8855
2.5235 4030 - 0.8863
2.5297 4040 - 0.8864
2.5360 4050 - 0.8898
2.5423 4060 - 0.8890
2.5485 4070 - 0.8893
2.5548 4080 - 0.8902
2.5611 4090 - 0.8886
2.5673 4100 - 0.8882
2.5736 4110 - 0.8884
2.5798 4120 - 0.8876
2.5861 4130 - 0.8877
2.5924 4140 - 0.8879
2.5986 4150 - 0.8871
2.6049 4160 - 0.8881
2.6111 4170 - 0.8870
2.6174 4180 - 0.8883
2.6237 4190 - 0.8878
2.6299 4200 - 0.8890
2.6362 4210 - 0.8878
2.6425 4220 - 0.8897
2.6487 4230 - 0.8864
2.6550 4240 - 0.8871
2.6612 4250 - 0.8876
2.6675 4260 - 0.8856
2.6738 4270 - 0.8878
2.6800 4280 - 0.8884
2.6863 4290 - 0.8891
2.6925 4300 - 0.8891
2.6988 4310 - 0.8880
2.7051 4320 - 0.8865
2.7113 4330 - 0.8877
2.7176 4340 - 0.8859
2.7239 4350 - 0.8861
2.7301 4360 - 0.8853
2.7364 4370 - 0.8851
2.7426 4380 - 0.8868
2.7489 4390 - 0.8875
2.7552 4400 - 0.8869
2.7614 4410 - 0.8903

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 5.0.0
  • Transformers: 4.53.1
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.8.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.2
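
To reproduce this environment, the versions above can be pinned at install time (install the matching CUDA 12.4 build of PyTorch 2.6.0 separately for your system):

pip install sentence-transformers==5.0.0 transformers==4.53.1 accelerate==1.8.1 datasets==4.0.0 tokenizers==0.21.2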

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

LoggingMNR

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}