SentenceTransformer based on thebajajra/RexBERT-base

This is a sentence-transformers model finetuned from thebajajra/RexBERT-base on the nomic-embed-unsupervised-data dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: thebajajra/RexBERT-base
  • Maximum Sequence Length: 1024 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: nomic-embed-unsupervised-data

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
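
The Pooling module above averages token embeddings over non-padding tokens. As a rough, hedged sketch of what the two modules do (assuming the checkpoint loads with a plain transformers AutoModel; the supported path is the SentenceTransformer API shown under Usage below):

import torch
from transformers import AutoTokenizer, AutoModel

model_id = "thebajajra/RexBERT-base-embed-pf-v0.2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = ["Corners are still lifting."]
batch = tokenizer(sentences, padding=True, truncation=True,
                  max_length=1024, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling over non-padding tokens (pooling_mode_mean_tokens=True)
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # torch.Size([1, 768])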

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("thebajajra/RexBERT-base-embed-pf-v0.2")
# Run inference
queries = [
    "Corners are still lifting.",
]
documents = [
    'Hello, I got my ender 3 a little over a year ago and have gotten many successful prints off of my machine. \n\nI have always had a problem with the corners of my prints lifting. I originally used a glass plate. That by itself was horrible, but then I added hairspray, and that worked. The problem was that on long prints, corners still lifted.\n\nAfter doing this for around 5 months I switched to a PEI sheet.\n\nThis worked comparably as well as the glass/hairspray combo, except the corners STILL LIFT on long prints.\n\nNow I have a PEI sheet on boro glass with an EZABL attached and the corners of my prints are STILL LIFTING.\n\nI don\'t know what i could possibly be doing wrong. The bed must be level. I get beautiful first layers, which I have tried to "smudge" around during printing and I can confirm that the plastic is being layed down solidly.\n\nIf anyone could enlighten me as to what is going on I would be thrilled.\n\nI do have my first layer printing at 30% speed with 150% layer width with the print cooling fan off as well. Printing PLA at 200C tool temp, 60C bed.',
    'These are awesome quart jars. They have a beautiful color, and I use them for storing soups, nuts and homemade nut milk. I would purchase them again.',
    'Great product. I purchased this item becuase my wrists would ache after triceps day at the gym. I would never be able to straighten my wrist and this helped in fixing that issue.',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 768] [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[0.5487, 0.0457, 0.1961]])
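
The paraphrase mining mentioned in the introduction is also available through a built-in utility. A short sketch reusing the model loaded above (the example sentences are invented for illustration):

from sentence_transformers.util import paraphrase_mining

sentences = [
    "The corners of my print keep lifting.",
    "My print's corners are warping off the bed.",
    "These quart jars are great for storing soup.",
]
# Returns [score, i, j] triples, sorted by decreasing similarity
pairs = paraphrase_mining(model, sentences)
for score, i, j in pairs[:3]:
    print(f"{score:.4f} | {sentences[i]} | {sentences[j]}")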

Training Details

Training Dataset

nomic-embed-unsupervised-data

  • Dataset: nomic-embed-unsupervised-data at 917bae6
  • Size: 221,599,363 training samples
  • Columns: query and document
  • Approximate statistics based on the first 1000 samples:
    • query: string; min: 6 tokens, mean: 34.17 tokens, max: 1024 tokens
    • document: string; min: 8 tokens, mean: 166.07 tokens, max: 1024 tokens
  • Samples:
    • query: Effect of steam reforming on methane-fueled chemical looping combustion with Cu-based oxygen carrier
      document: Abstract The reduction characteristics of Cu-based oxygen carrier with H2, CO and CH4 were investigated using a fixed bed reactor, TPR and TGA. Results showed that temperatures for the complete reduction of Cu-based oxygen carrier with H2 and CO are 300 °C and 225 °C, respectively, while the corresponding temperature with CH4 is 650 °C. The carbon deposition from CH4 occurred at over 550 °C. CO-chemisorption experiments were also conducted on the oxygen carrier, and it was indicated that Cu-based oxygen carrier sinter seriously at 700 °C. In order to lower the required reduction temperature of oxygen carriers, a new chemical looping combustion (CLC) process with CH4 steam reforming has been presented in this paper. The basic feasibility of the process was illustrated using CuO–SiO2. The new CLC process has the potential to replace the conventional gas-fired middle- and low-pressure steam and hot water boilers.
    • query: who appointed onesicritus as chief pilot of the fleet
      document: by the king to hold a conference with the Indian philosophers or Gymnosophists, the details of which have been transmitted to us from his own account of the interview. It was Onesicritus, whom Alexander first sent to summon Dandamis to his court. When later Onesicritus returned empty-handed with the reply of Dandamis, the King went to forest to visit Dandamis. When Alexander constructed his fleet on the Hydaspes, he appointed Onesicritus to the important position of pilot of the king's ship, or chief pilot of the fleet (). Onesicritus held this position not only during the descent of the Indus,
    • query: when did the madonna of foligno go to paris
      document: Madonna of Foligno hence the name. In 1799 it was carried to Paris, France by Napoleon. There, in 1802, the painting was transferred from panel to canvas by Hacquin and restored by Roser of Heidelberg. A note was made by the restorer: "Rapporto dei cittadini Guijon Vincent Tannay e Berthollet sul ristauro dei quadri di Raffaello conosciuto sotto il nome di Madonna di Foligno." In 1815, after the Battle of Waterloo, it was returned to Italy, where it was placed in the room with the Transfiguration in the Pinacoteca Vaticana of the Vatican Museum in the Vatican City. The painting is a "sacra
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
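
For reference, a minimal sketch of how this loss is typically constructed with the parameters above (illustrative only; the actual training pipeline is not part of this card):

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.util import cos_sim

# Base model prior to finetuning
model = SentenceTransformer("thebajajra/RexBERT-base")

# In-batch negatives: for each (query, document) pair, every other
# document in the batch acts as a negative.
loss = MultipleNegativesRankingLoss(
    model,
    scale=20.0,              # temperature-like multiplier on similarity scores
    similarity_fct=cos_sim,  # cosine similarity, matching "cos_sim" above
)
# gather_across_devices=False is the default, per the parameter dump above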
    

Evaluation Dataset

nomic-embed-unsupervised-data

  • Dataset: nomic-embed-unsupervised-data at 917bae6
  • Size: 1,113,579 evaluation samples
  • Columns: query and document
  • Approximate statistics based on the first 1000 samples:
    • query: string; min: 5 tokens, mean: 31.98 tokens, max: 1024 tokens
    • document: string; min: 6 tokens, mean: 161.48 tokens, max: 1024 tokens
  • Samples:
    • query: Concise methods for the synthesis of chiral polyoxazolines and their application in asymmetric hydrosilylation
      document: Seven polyoxazoline ligands were synthesized in high yield in a one-pot reaction by heating polycarboxylic acids or their esters and chiral β-amino alcohols under reflux with concomitant removal of water or the alcohol produced in the reaction. The method is much simpler and more efficient in comparison to those methods reported in the literature. The compounds were used as chiral ligands in the rhodium-catalyzed asymmetric hydrosilylation of aromatic ketones, and the effects of the linkers and the substituents present on the oxazoline rings on the yield and enantioselectivity investigated. Compound 2 was identified as the best ligand of this family for the hydrosilylation of aromatic ketones.
    • query: On the road to a stronger public health workforce: visual tools to address complex challenges.
      document: The Public Health Workforce Taxonomy: Revisions and Recommendations for Implementation
    • query: 140mm Jetflo fan availability?
      document: I recently purchased a Nepton 280L, and would like to install an additional pair of 140mm Jetflo fans. Unfortunately they don't seem to be currently available, will they be in the future?

      Thank you so much!

      PS - I'm loving the cooling system!
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 384
  • per_device_eval_batch_size: 128
  • learning_rate: 1e-05
  • num_train_epochs: 4
  • warmup_steps: 1000
  • bf16: True
  • dataloader_num_workers: 20
  • dataloader_prefetch_factor: 4
  • ddp_find_unused_parameters: False
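
These map directly onto SentenceTransformerTrainingArguments; a minimal sketch reproducing just the non-default values above (output_dir is a placeholder, not from this card):

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=384,
    per_device_eval_batch_size=128,
    learning_rate=1e-5,
    num_train_epochs=4,
    warmup_steps=1000,
    bf16=True,
    dataloader_num_workers=20,
    dataloader_prefetch_factor=4,
    ddp_find_unused_parameters=False,
)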

All Hyperparameters

Click to expand
  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 384
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 1e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 1000
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 20
  • dataloader_prefetch_factor: 4
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: False
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Click to expand
Epoch Step Training Loss Validation Loss
0.0014 100 4.5665 -
0.0028 200 2.223 -
0.0042 300 0.3767 -
0.0055 400 0.1622 -
0.0069 500 0.1154 -
0.0083 600 0.0934 -
0.0097 700 0.0797 -
0.0111 800 0.0704 -
0.0125 900 0.0625 -
0.0139 1000 0.0582 -
0.0152 1100 0.0535 -
0.0166 1200 0.0492 -
0.0180 1300 0.0463 -
0.0194 1400 0.044 -
0.0208 1500 0.0416 -
0.0222 1600 0.0395 -
0.0236 1700 0.0381 -
0.0250 1800 0.0367 -
0.0263 1900 0.0358 -
0.0277 2000 0.0345 -
0.0291 2100 0.0335 -
0.0305 2200 0.0319 -
0.0319 2300 0.0318 -
0.0333 2400 0.0304 -
0.0347 2500 0.0301 -
0.0360 2600 0.0291 -
0.0374 2700 0.0293 -
0.0388 2800 0.0281 -
0.0402 2900 0.0277 -
0.0416 3000 0.0266 -
0.0430 3100 0.0265 -
0.0444 3200 0.0261 -
0.0457 3300 0.0253 -
0.0471 3400 0.0256 -
0.0485 3500 0.0247 -
0.0499 3600 0.0239 -
0.0513 3700 0.0239 -
0.0527 3800 0.0235 -
0.0541 3900 0.0233 -
0.0555 4000 0.0229 -
0.0568 4100 0.0227 -
0.0582 4200 0.0226 -
0.0596 4300 0.0221 -
0.0610 4400 0.0219 -
0.0624 4500 0.0211 -
0.0638 4600 0.0212 -
0.0652 4700 0.021 -
0.0665 4800 0.0205 -
0.0679 4900 0.0202 -
0.0693 5000 0.0206 -
0.0707 5100 0.0199 -
0.0721 5200 0.0202 -
0.0735 5300 0.0194 -
0.0749 5400 0.0195 -
0.0762 5500 0.0189 -
0.0776 5600 0.0194 -
0.0790 5700 0.0189 -
0.0804 5800 0.0183 -
0.0818 5900 0.0184 -
0.0832 6000 0.0183 -
0.0846 6100 0.018 -
0.0859 6200 0.0178 -
0.0873 6300 0.018 -
0.0887 6400 0.0174 -
0.0901 6500 0.0175 -
0.0915 6600 0.0176 -
0.0929 6700 0.0171 -
0.0943 6800 0.0168 -
0.0957 6900 0.0174 -
0.0970 7000 0.0171 -
0.0984 7100 0.0169 -
0.0998 7200 0.0167 -
0.1012 7300 0.0165 -
0.1026 7400 0.0166 -
0.1040 7500 0.0162 -
0.1054 7600 0.0164 -
0.1067 7700 0.0159 -
0.1081 7800 0.0159 -
0.1095 7900 0.0162 -
0.1109 8000 0.0157 -
0.1123 8100 0.0157 -
0.1137 8200 0.0155 -
0.1151 8300 0.0154 -
0.1164 8400 0.0155 -
0.1178 8500 0.0154 -
0.1192 8600 0.015 -
0.1206 8700 0.0151 -
0.1220 8800 0.0149 -
0.1234 8900 0.015 -
0.1248 9000 0.0146 -
0.1262 9100 0.015 -
0.1275 9200 0.0148 -
0.1289 9300 0.0145 -
0.1303 9400 0.0146 -
0.1317 9500 0.0148 -
0.1331 9600 0.0143 -
0.1345 9700 0.0144 -
0.1359 9800 0.0142 -
0.1372 9900 0.0142 -
0.1386 10000 0.0141 -
0.1400 10100 0.0139 -
0.1414 10200 0.0141 -
0.1428 10300 0.0139 -
0.1442 10400 0.0136 -
0.1456 10500 0.0135 -
0.1469 10600 0.0135 -
0.1483 10700 0.0134 -
0.1497 10800 0.0136 -
0.1511 10900 0.0133 -
0.1525 11000 0.0135 -
0.1539 11100 0.0133 -
0.1553 11200 0.0134 -
0.1567 11300 0.0133 -
0.1580 11400 0.0134 -
0.1594 11500 0.013 -
0.1608 11600 0.0131 -
0.1622 11700 0.0129 -
0.1636 11800 0.0127 -
0.1650 11900 0.0129 -
0.1664 12000 0.0125 -
0.1677 12100 0.0129 -
0.1691 12200 0.013 -
0.1705 12300 0.013 -
0.1719 12400 0.013 -
0.1733 12500 0.0125 -
0.1747 12600 0.0125 -
0.1761 12700 0.0122 -
0.1774 12800 0.0124 -
0.1788 12900 0.0124 -
0.1802 13000 0.0121 -
0.1816 13100 0.0124 -
0.1830 13200 0.0122 -
0.1844 13300 0.0123 -
0.1858 13400 0.0121 -
0.1871 13500 0.012 -
0.1885 13600 0.0118 -
0.1899 13700 0.0119 -
0.1913 13800 0.0117 -
0.1927 13900 0.0119 -
0.1941 14000 0.0119 -
0.1955 14100 0.0117 -
0.1969 14200 0.0119 -
0.1982 14300 0.0116 -
0.1996 14400 0.0116 -
0.2 14427 - 0.0044
0.2010 14500 0.012 -
0.2024 14600 0.0116 -
0.2038 14700 0.0118 -
0.2052 14800 0.0116 -
0.2066 14900 0.0118 -
0.2079 15000 0.0118 -
0.2093 15100 0.0113 -
0.2107 15200 0.0114 -
0.2121 15300 0.0115 -
0.2135 15400 0.0116 -
0.2149 15500 0.0113 -
0.2163 15600 0.0115 -
0.2176 15700 0.0112 -
0.2190 15800 0.0112 -
0.2204 15900 0.0114 -
0.2218 16000 0.0111 -
0.2232 16100 0.0112 -
0.2246 16200 0.0111 -
0.2260 16300 0.011 -
0.2274 16400 0.011 -
0.2287 16500 0.0109 -
0.2301 16600 0.0106 -
0.2315 16700 0.011 -
0.2329 16800 0.011 -
0.2343 16900 0.0108 -
0.2357 17000 0.0106 -
0.2371 17100 0.0108 -
0.2384 17200 0.0107 -
0.2398 17300 0.0105 -
0.2412 17400 0.0107 -
0.2426 17500 0.011 -
0.2440 17600 0.0105 -
0.2454 17700 0.0107 -
0.2468 17800 0.0106 -
0.2481 17900 0.0108 -
0.2495 18000 0.0106 -
0.2509 18100 0.0105 -
0.2523 18200 0.0103 -
0.2537 18300 0.0104 -
0.2551 18400 0.0105 -
0.2565 18500 0.0103 -
0.2578 18600 0.0104 -
0.2592 18700 0.0103 -
0.2606 18800 0.0102 -
0.2620 18900 0.0101 -
0.2634 19000 0.0102 -
0.2648 19100 0.0103 -
0.2662 19200 0.01 -
0.2676 19300 0.0103 -
0.2689 19400 0.0101 -
0.2703 19500 0.0103 -
0.2717 19600 0.0101 -
0.2731 19700 0.0103 -
0.2745 19800 0.0102 -
0.2759 19900 0.0102 -
0.2773 20000 0.0103 -
0.2786 20100 0.0101 -
0.2800 20200 0.0102 -
0.2814 20300 0.0099 -
0.2828 20400 0.0099 -
0.2842 20500 0.0099 -
0.2856 20600 0.0098 -
0.2870 20700 0.0099 -
0.2883 20800 0.0097 -
0.2897 20900 0.0101 -
0.2911 21000 0.0098 -
0.2925 21100 0.0099 -
0.2939 21200 0.0099 -
0.2953 21300 0.0098 -
0.2967 21400 0.0096 -
0.2981 21500 0.0097 -
0.2994 21600 0.0097 -
0.3008 21700 0.0099 -
0.3022 21800 0.0098 -
0.3036 21900 0.0096 -
0.3050 22000 0.0097 -
0.3064 22100 0.0098 -
0.3078 22200 0.0094 -
0.3091 22300 0.0096 -
0.3105 22400 0.0095 -
0.3119 22500 0.0098 -
0.3133 22600 0.0096 -
0.3147 22700 0.0094 -
0.3161 22800 0.0095 -
0.3175 22900 0.0093 -
0.3188 23000 0.0093 -
0.3202 23100 0.0093 -
0.3216 23200 0.0094 -
0.3230 23300 0.0094 -
0.3244 23400 0.0093 -
0.3258 23500 0.0091 -
0.3272 23600 0.0093 -
0.3286 23700 0.0093 -
0.3299 23800 0.0093 -
0.3313 23900 0.0093 -
0.3327 24000 0.0093 -
0.3341 24100 0.009 -
0.3355 24200 0.0093 -
0.3369 24300 0.0089 -
0.3383 24400 0.0089 -
0.3396 24500 0.0092 -
0.3410 24600 0.009 -
0.3424 24700 0.0092 -
0.3438 24800 0.009 -
0.3452 24900 0.0091 -
0.3466 25000 0.0088 -
0.3480 25100 0.009 -
0.3493 25200 0.0089 -
0.3507 25300 0.0088 -
0.3521 25400 0.0089 -
0.3535 25500 0.0089 -
0.3549 25600 0.009 -
0.3563 25700 0.0092 -
0.3577 25800 0.0089 -
0.3590 25900 0.0089 -
0.3604 26000 0.009 -
0.3618 26100 0.0088 -
0.3632 26200 0.0088 -
0.3646 26300 0.0091 -
0.3660 26400 0.0088 -
0.3674 26500 0.0089 -
0.3688 26600 0.0087 -
0.3701 26700 0.0089 -
0.3715 26800 0.0087 -
0.3729 26900 0.0088 -
0.3743 27000 0.0086 -
0.3757 27100 0.0088 -
0.3771 27200 0.0087 -
0.3785 27300 0.0085 -
0.3798 27400 0.0085 -
0.3812 27500 0.0086 -
0.3826 27600 0.0088 -
0.3840 27700 0.0084 -
0.3854 27800 0.0086 -
0.3868 27900 0.0085 -
0.3882 28000 0.0085 -
0.3895 28100 0.0086 -
0.3909 28200 0.0085 -
0.3923 28300 0.0086 -
0.3937 28400 0.0088 -
0.3951 28500 0.0086 -
0.3965 28600 0.0085 -
0.3979 28700 0.0086 -
0.3993 28800 0.0085 -
0.4 28854 - 0.0031
0.4006 28900 0.0084 -
0.4020 29000 0.0084 -
0.4034 29100 0.0085 -
0.4048 29200 0.0083 -
0.4062 29300 0.0084 -
0.4076 29400 0.0084 -
0.4090 29500 0.0084 -
0.4103 29600 0.0082 -
0.4117 29700 0.0085 -
0.4131 29800 0.0083 -
0.4145 29900 0.0081 -
0.4159 30000 0.0084 -
0.4173 30100 0.0085 -
0.4187 30200 0.0081 -
0.4200 30300 0.0084 -
0.4214 30400 0.0084 -
0.4228 30500 0.0082 -
0.4242 30600 0.0084 -
0.4256 30700 0.0084 -
0.4270 30800 0.0082 -
0.4284 30900 0.0081 -
0.4297 31000 0.0081 -
0.4311 31100 0.0079 -
0.4325 31200 0.0082 -
0.4339 31300 0.0082 -
0.4353 31400 0.0082 -
0.4367 31500 0.0079 -
0.4381 31600 0.0079 -
0.4395 31700 0.0081 -
0.4408 31800 0.008 -
0.4422 31900 0.0081 -
0.4436 32000 0.0081 -
0.4450 32100 0.0081 -
0.4464 32200 0.0078 -
0.4478 32300 0.0079 -
0.4492 32400 0.0081 -
0.4505 32500 0.0081 -
0.4519 32600 0.0081 -
0.4533 32700 0.0079 -
0.4547 32800 0.0079 -
0.4561 32900 0.0079 -
0.4575 33000 0.0079 -
0.4589 33100 0.0079 -
0.4602 33200 0.0078 -
0.4616 33300 0.0077 -
0.4630 33400 0.008 -
0.4644 33500 0.0079 -
0.4658 33600 0.008 -
0.4672 33700 0.0079 -
0.4686 33800 0.0078 -
0.4700 33900 0.008 -
0.4713 34000 0.0077 -
0.4727 34100 0.0077 -
0.4741 34200 0.0078 -
0.4755 34300 0.0076 -
0.4769 34400 0.0078 -
0.4783 34500 0.0078 -
0.4797 34600 0.0078 -
0.4810 34700 0.0079 -
0.4824 34800 0.0078 -
0.4838 34900 0.0077 -
0.4852 35000 0.0075 -
0.4866 35100 0.0076 -
0.4880 35200 0.0078 -
0.4894 35300 0.0076 -
0.4907 35400 0.0078 -
0.4921 35500 0.0077 -
0.4935 35600 0.0076 -
0.4949 35700 0.0076 -
0.4963 35800 0.0077 -
0.4977 35900 0.0076 -
0.4991 36000 0.0077 -
0.5005 36100 0.0077 -
0.5018 36200 0.0077 -
0.5032 36300 0.0077 -
0.5046 36400 0.0076 -
0.5060 36500 0.0076 -
0.5074 36600 0.0077 -
0.5088 36700 0.0076 -
0.5102 36800 0.0075 -
0.5115 36900 0.0077 -
0.5129 37000 0.0076 -
0.5143 37100 0.0075 -
0.5157 37200 0.0074 -
0.5171 37300 0.0074 -
0.5185 37400 0.0075 -
0.5199 37500 0.0075 -
0.5212 37600 0.0074 -
0.5226 37700 0.0074 -
0.5240 37800 0.0072 -
0.5254 37900 0.0076 -
0.5268 38000 0.0075 -
0.5282 38100 0.0072 -
0.5296 38200 0.0074 -
0.5309 38300 0.0073 -
0.5323 38400 0.0073 -
0.5337 38500 0.0074 -
0.5351 38600 0.0073 -
0.5365 38700 0.0073 -
0.5379 38800 0.0074 -
0.5393 38900 0.0072 -
0.5407 39000 0.0076 -
0.5420 39100 0.0072 -
0.5434 39200 0.0073 -
0.5448 39300 0.0071 -
0.5462 39400 0.0072 -
0.5476 39500 0.0073 -
0.5490 39600 0.0074 -
0.5504 39700 0.0072 -
0.5517 39800 0.0072 -
0.5531 39900 0.0073 -
0.5545 40000 0.0071 -
0.5559 40100 0.0072 -
0.5573 40200 0.0072 -
0.5587 40300 0.0071 -
0.5601 40400 0.0072 -
0.5614 40500 0.0071 -
0.5628 40600 0.0073 -
0.5642 40700 0.0073 -
0.5656 40800 0.0072 -
0.5670 40900 0.0071 -
0.5684 41000 0.0073 -
0.5698 41100 0.0072 -
0.5712 41200 0.0071 -
0.5725 41300 0.0074 -
0.5739 41400 0.0072 -
0.5753 41500 0.0071 -
0.5767 41600 0.0071 -
0.5781 41700 0.007 -
0.5795 41800 0.0071 -
0.5809 41900 0.0071 -
0.5822 42000 0.0073 -
0.5836 42100 0.0071 -
0.5850 42200 0.0069 -
0.5864 42300 0.0071 -
0.5878 42400 0.0072 -
0.5892 42500 0.0073 -
0.5906 42600 0.0071 -
0.5919 42700 0.0071 -
0.5933 42800 0.0072 -
0.5947 42900 0.0071 -
0.5961 43000 0.0072 -
0.5975 43100 0.007 -
0.5989 43200 0.0072 -
0.6 43281 - 0.0026
0.6003 43300 0.0071 -
0.6016 43400 0.0069 -
0.6030 43500 0.007 -
0.6044 43600 0.0069 -
0.6058 43700 0.007 -
0.6072 43800 0.0068 -
0.6086 43900 0.0071 -
0.6100 44000 0.0069 -
0.6114 44100 0.0069 -
0.6127 44200 0.0069 -
0.6141 44300 0.0071 -
0.6155 44400 0.0071 -
0.6169 44500 0.007 -
0.6183 44600 0.0069 -
0.6197 44700 0.0069 -
0.6211 44800 0.007 -
0.6224 44900 0.0068 -
0.6238 45000 0.0069 -
0.6252 45100 0.0069 -
0.6266 45200 0.0069 -
0.6280 45300 0.0068 -
0.6294 45400 0.0069 -
0.6308 45500 0.007 -
0.6321 45600 0.0068 -
0.6335 45700 0.0068 -
0.6349 45800 0.0068 -
0.6363 45900 0.0069 -
0.6377 46000 0.007 -
0.6391 46100 0.0067 -
0.6405 46200 0.0066 -
0.6419 46300 0.0069 -
0.6432 46400 0.0068 -
0.6446 46500 0.007 -
0.6460 46600 0.0069 -
0.6474 46700 0.0069 -
0.6488 46800 0.0068 -
0.6502 46900 0.007 -
0.6516 47000 0.0069 -
0.6529 47100 0.0067 -
0.6543 47200 0.0068 -
0.6557 47300 0.0065 -
0.6571 47400 0.0067 -
0.6585 47500 0.007 -
0.6599 47600 0.0067 -
0.6613 47700 0.0067 -
0.6626 47800 0.0068 -
0.6640 47900 0.0067 -
0.6654 48000 0.0066 -
0.6668 48100 0.0069 -
0.6682 48200 0.0067 -
0.6696 48300 0.0067 -
0.6710 48400 0.0067 -
0.6724 48500 0.0069 -
0.6737 48600 0.0066 -
0.6751 48700 0.0066 -
0.6765 48800 0.0068 -
0.6779 48900 0.0067 -
0.6793 49000 0.0067 -
0.6807 49100 0.0068 -
0.6821 49200 0.0066 -
0.6834 49300 0.0067 -
0.6848 49400 0.0065 -
0.6862 49500 0.0067 -
0.6876 49600 0.0066 -
0.6890 49700 0.0065 -
0.6904 49800 0.0067 -
0.6918 49900 0.0066 -
0.6931 50000 0.0066 -
0.6945 50100 0.0066 -
0.6959 50200 0.0065 -
0.6973 50300 0.0068 -
0.6987 50400 0.0068 -
0.7001 50500 0.0066 -
0.7015 50600 0.0067 -
0.7028 50700 0.0068 -
0.7042 50800 0.0066 -
0.7056 50900 0.0065 -
0.7070 51000 0.0065 -
0.7084 51100 0.0065 -
0.7098 51200 0.0066 -
0.7112 51300 0.0065 -
0.7126 51400 0.0064 -
0.7139 51500 0.0063 -
0.7153 51600 0.0064 -
0.7167 51700 0.0063 -
0.7181 51800 0.0064 -
0.7195 51900 0.0065 -
0.7209 52000 0.0065 -
0.7223 52100 0.0065 -
0.7236 52200 0.0065 -
0.7250 52300 0.0065 -
0.7264 52400 0.0065 -
0.7278 52500 0.0065 -
0.7292 52600 0.0064 -
0.7306 52700 0.0065 -
0.7320 52800 0.0064 -
0.7333 52900 0.0064 -
0.7347 53000 0.0065 -
0.7361 53100 0.0063 -
0.7375 53200 0.0063 -
0.7389 53300 0.0064 -
0.7403 53400 0.0064 -
0.7417 53500 0.0064 -
0.7431 53600 0.0066 -
0.7444 53700 0.0064 -
0.7458 53800 0.0063 -
0.7472 53900 0.0064 -
0.7486 54000 0.0063 -
0.7500 54100 0.0063 -
0.7514 54200 0.0062 -
0.7528 54300 0.0064 -
0.7541 54400 0.0063 -
0.7555 54500 0.0063 -
0.7569 54600 0.0062 -
0.7583 54700 0.0063 -
0.7597 54800 0.0062 -
0.7611 54900 0.0062 -
0.7625 55000 0.0063 -
0.7638 55100 0.0065 -
0.7652 55200 0.0064 -
0.7666 55300 0.0062 -
0.7680 55400 0.0064 -
0.7694 55500 0.0063 -
0.7708 55600 0.0063 -
0.7722 55700 0.0062 -
0.7735 55800 0.0063 -
0.7749 55900 0.0062 -
0.7763 56000 0.0063 -
0.7777 56100 0.0064 -
0.7791 56200 0.0062 -
0.7805 56300 0.0065 -
0.7819 56400 0.006 -
0.7833 56500 0.0065 -
0.7846 56600 0.006 -
0.7860 56700 0.0062 -
0.7874 56800 0.0064 -
0.7888 56900 0.0061 -
0.7902 57000 0.0063 -
0.7916 57100 0.0062 -
0.7930 57200 0.0062 -
0.7943 57300 0.0062 -
0.7957 57400 0.0062 -
0.7971 57500 0.0062 -
0.7985 57600 0.0061 -
0.7999 57700 0.0061 -
0.8 57708 - 0.0022
0.8013 57800 0.0064 -
0.8027 57900 0.0062 -
0.8040 58000 0.0063 -
0.8054 58100 0.0061 -
0.8068 58200 0.0061 -
0.8082 58300 0.0063 -
0.8096 58400 0.0062 -
0.8110 58500 0.0062 -
0.8124 58600 0.0061 -
0.8138 58700 0.0062 -
0.8151 58800 0.0061 -
0.8165 58900 0.0061 -
0.8179 59000 0.0062 -
0.8193 59100 0.0062 -
0.8207 59200 0.0061 -
0.8221 59300 0.006 -
0.8235 59400 0.0061 -
0.8248 59500 0.006 -
0.8262 59600 0.006 -
0.8276 59700 0.0061 -
0.8290 59800 0.0062 -
0.8304 59900 0.0059 -
0.8318 60000 0.006 -
0.8332 60100 0.006 -
0.8345 60200 0.0061 -
0.8359 60300 0.006 -
0.8373 60400 0.0059 -
0.8387 60500 0.0061 -
0.8401 60600 0.006 -
0.8415 60700 0.0059 -
0.8429 60800 0.006 -
0.8443 60900 0.0061 -
0.8456 61000 0.0062 -
0.8470 61100 0.006 -
0.8484 61200 0.006 -
0.8498 61300 0.0059 -
0.8512 61400 0.0059 -
0.8526 61500 0.006 -
0.8540 61600 0.006 -
0.8553 61700 0.0059 -
0.8567 61800 0.006 -
0.8581 61900 0.0059 -
0.8595 62000 0.0059 -
0.8609 62100 0.0059 -
0.8623 62200 0.0059 -
0.8637 62300 0.0062 -
0.8650 62400 0.0061 -
0.8664 62500 0.0059 -
0.8678 62600 0.006 -
0.8692 62700 0.0061 -
0.8706 62800 0.0059 -
0.8720 62900 0.0061 -
0.8734 63000 0.006 -
0.8747 63100 0.0059 -
0.8761 63200 0.0059 -
0.8775 63300 0.0057 -
0.8789 63400 0.006 -
0.8803 63500 0.0058 -
0.8817 63600 0.0059 -
0.8831 63700 0.0058 -
0.8845 63800 0.0058 -
0.8858 63900 0.0059 -
0.8872 64000 0.0059 -
0.8886 64100 0.0059 -
0.8900 64200 0.0058 -
0.8914 64300 0.0058 -
0.8928 64400 0.006 -
0.8942 64500 0.0059 -
0.8955 64600 0.0059 -
0.8969 64700 0.0059 -
0.8983 64800 0.0058 -
0.8997 64900 0.0059 -
0.9011 65000 0.0059 -
0.9025 65100 0.0058 -
0.9039 65200 0.0058 -
0.9052 65300 0.0058 -
0.9066 65400 0.0059 -
0.9080 65500 0.0057 -
0.9094 65600 0.0057 -
0.9108 65700 0.0059 -
0.9122 65800 0.0059 -
0.9136 65900 0.0058 -
0.9150 66000 0.0058 -
0.9163 66100 0.0058 -
0.9177 66200 0.0057 -
0.9191 66300 0.0057 -
0.9205 66400 0.0059 -
0.9219 66500 0.0056 -
0.9233 66600 0.0058 -
0.9247 66700 0.0057 -
0.9260 66800 0.0058 -
0.9274 66900 0.0056 -
0.9288 67000 0.0057 -
0.9302 67100 0.0057 -
0.9316 67200 0.0055 -
0.9330 67300 0.0058 -
0.9344 67400 0.0058 -
0.9357 67500 0.0058 -
0.9371 67600 0.0057 -
0.9385 67700 0.0058 -
0.9399 67800 0.0056 -
0.9413 67900 0.0057 -
0.9427 68000 0.0058 -
0.9441 68100 0.0058 -
0.9454 68200 0.0057 -
0.9468 68300 0.0057 -
0.9482 68400 0.0057 -
0.9496 68500 0.0057 -
0.9510 68600 0.0057 -
0.9524 68700 0.0057 -
0.9538 68800 0.0059 -
0.9552 68900 0.0058 -
0.9565 69000 0.0058 -
0.9579 69100 0.0056 -
0.9593 69200 0.0057 -
0.9607 69300 0.0057 -
0.9621 69400 0.0057 -
0.9635 69500 0.0058 -
0.9649 69600 0.0056 -
0.9662 69700 0.0059 -
0.9676 69800 0.0055 -
0.9690 69900 0.0057 -
0.9704 70000 0.0054 -
0.9718 70100 0.0055 -
0.9732 70200 0.0055 -
0.9746 70300 0.0057 -
0.9759 70400 0.0057 -
0.9773 70500 0.0057 -
0.9787 70600 0.0056 -
0.9801 70700 0.0058 -
0.9815 70800 0.0054 -
0.9829 70900 0.0057 -
0.9843 71000 0.0056 -
0.9857 71100 0.0057 -
0.9870 71200 0.0057 -
0.9884 71300 0.0056 -
0.9898 71400 0.0057 -
0.9912 71500 0.0055 -
0.9926 71600 0.0055 -
0.9940 71700 0.0057 -
0.9954 71800 0.0057 -
0.9967 71900 0.0056 -
0.9981 72000 0.0058 -
0.9995 72100 0.0056 -
1.0 72135 - 0.0020

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 5.1.2
  • Transformers: 4.57.1
  • PyTorch: 2.8.0+cu129
  • Accelerate: 1.11.0
  • Datasets: 4.3.0
  • Tokenizers: 0.22.1
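
To approximate this environment, the versions above can be pinned at install time (a sketch; the PyTorch 2.8.0+cu129 wheel typically comes from the matching PyTorch index):

pip install sentence-transformers==5.1.2 transformers==4.57.1 accelerate==1.11.0 datasets==4.3.0 tokenizers==0.22.1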

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}