SentenceTransformer based on BAAI/bge-base-en-v1.5

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
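The Pooling module takes the [CLS] token embedding, and the final Normalize() module scales each embedding to unit L2 norm, so the dot product of two embeddings equals their cosine similarity. A minimal numpy sketch of this property, using stand-in vectors rather than real model outputs:

```python
import numpy as np

def normalize(v):
    # Mirrors the model's Normalize() module: scale each row to unit L2 norm
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

a = normalize(np.array([[1.0, 2.0, 2.0]]))  # stand-in "embedding"
b = normalize(np.array([[2.0, 1.0, 2.0]]))

dot = float(a @ b.T)
cos = float((a @ b.T) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(round(dot, 6) == round(cos, 6))  # True: on unit vectors, dot product == cosine
```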

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("ayushexel/emb-bge-base-en-v1.5-squad-8-epochs")
# Run inference
sentences = [
    'What did the composition of the cardinals consist of?',
    'Pope Sixtus V limited the number of cardinals to 70, comprising six cardinal bishops, 50 cardinal priests, and 14 cardinal deacons. Starting in the pontificate of Pope John XXIII, that limit has been exceeded. At the start of 1971, Pope Paul VI set the number of cardinal electors at a maximum of 120, but set no limit on the number of cardinals generally. He also established a maximum age of eighty years for electors. His action deprived twenty-five living cardinals, including the three living cardinals elevated by Pope Pius XI, of the right to participate in a conclave.[citation needed] Popes can dispense from church laws and have sometimes brought the number of cardinals under the age of 80 to more than 120. Pope Paul VI also increased the number of cardinal bishops by giving that rank to patriarchs of the Eastern Catholic Churches.',
    'Pope Sixtus V limited the number of cardinals to 70, comprising six cardinal bishops, 50 cardinal priests, and 14 cardinal deacons. Starting in the pontificate of Pope John XXIII, that limit has been exceeded. At the start of 1971, Pope Paul VI set the number of cardinal electors at a maximum of 120, but set no limit on the number of cardinals generally. He also established a maximum age of eighty years for electors. His action deprived twenty-five living cardinals, including the three living cardinals elevated by Pope Pius XI, of the right to participate in a conclave.[citation needed] Popes can dispense from church laws and have sometimes brought the number of cardinals under the age of 80 to more than 120. Pope Paul VI also increased the number of cardinal bishops by giving that rank to patriarchs of the Eastern Catholic Churches.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
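For semantic search, the same embeddings can be used to rank passages against a query. Because this model's embeddings are unit-normalized, ranking by dot product is equivalent to ranking by cosine similarity. A sketch with stand-in embeddings (in practice these would come from model.encode):

```python
import numpy as np

# Stand-in unit embeddings; real ones come from model.encode(...)
query = np.array([0.6, 0.8, 0.0])
passages = np.array([
    [0.6, 0.8, 0.0],  # near-duplicate of the query
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

# Unit-normalized embeddings: cosine similarity reduces to a dot product
scores = passages @ query
ranking = np.argsort(-scores)  # indices of passages, best match first
print(ranking.tolist())  # [0, 1, 2]
```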

Evaluation

Metrics

Triplet

  • cosine_accuracy: 0.4066
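The triplet cosine accuracy reported here is the fraction of (anchor, positive, negative) triplets for which the anchor embedding is more cosine-similar to its positive than to its negative. A minimal sketch of that computation with stand-in 2-D vectors:

```python
import numpy as np

def cos(a, b):
    # Plain cosine similarity between two vectors
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-in (anchor, positive, negative) embedding triplets
triplets = [
    (np.array([1.0, 0.0]), np.array([0.9, 0.1]), np.array([0.0, 1.0])),  # anchor closer to positive
    (np.array([0.0, 1.0]), np.array([1.0, 0.0]), np.array([0.1, 0.9])),  # anchor closer to negative
]

hits = sum(cos(a, p) > cos(a, n) for a, p, n in triplets)
accuracy = hits / len(triplets)
print(accuracy)  # 0.5
```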

Training Details

Training Dataset

Unnamed Dataset

  • Size: 44,288 training samples
  • Columns: question, context, and negative
  • Approximate statistics based on the first 1000 samples:
    • question: string; min 6, mean 14.58, max 37 tokens
    • context: string; min 34, mean 150.61, max 512 tokens
    • negative: string; min 27, mean 153.22, max 481 tokens
  • Samples:
    question context negative
    How many judges were originally planned for American Idol? The show had originally planned on having four judges following the Pop Idol format; however, only three judges had been found by the time of the audition round in the first season, namely Randy Jackson, Paula Abdul and Simon Cowell. A fourth judge, radio DJ Stryker, was originally chosen but he dropped out citing "image concerns". In the second season, New York radio personality Angie Martinez had been hired as a fourth judge but withdrew only after a few days of auditions due to not being comfortable with giving out criticism. The show decided to continue with the three judges format until season eight. All three original judges stayed on the judging panel for eight seasons. On February 14, 2009, The Walt Disney Company debuted "The American Idol Experience" at its Disney's Hollywood Studios theme park at the Walt Disney World Resort in Florida. In this live production, co-produced by 19 Entertainment, park guests chose from a list of songs and auditioned privately for Disney cast members. Those selected then performed on a stage in a 1000-seat theater replicating the Idol set. Three judges, whose mannerisms and style mimicked those of the real Idol judges, critiqued the performances. Audience members then voted for their favorite performer. There were several preliminary-round shows during the day that culminated in a "finals" show in the evening where one of the winners of the previous rounds that day was selected as the overall winner. The winner of the finals show received a "Dream Ticket" that granted them front-of-the-line privileges at any future American Idol audition. The attraction closed on August 30, 2014.
    What genre of music did season ten American Idol contestant Lauren Alaina sing? The two finalists in 2011 were Lauren Alaina and Scotty McCreery, both teenage country singers. McCreery won the competition on May 25, being the youngest male winner and the fourth male in a row to win American Idol. McCreery released his first single, "I Love You This Big", as his coronation song, and Alaina released "Like My Mother Does". McCreery's debut album, Clear as Day, became the first debut album by an Idol winner to reach No. 1 on the US Billboard 200 since Ruben Studdard's Soulful in 2003, and he became the youngest male artist to reach No. 1 on the Billboard 200. The impact of American Idol is also strongly felt in musical theatre, where many of Idol alumni have forged successful careers. The striking effect of former American Idol contestants on Broadway has been noted and commented on. The casting of a popular Idol contestant can lead to significantly increased ticket sales. Other alumni have gone on to work in television and films, the most notable being Jennifer Hudson who, on the recommendation of the Idol vocal coach Debra Byrd, won a role in Dreamgirls and subsequently received an Academy Award for her performance.
    What was responsible for creating thousands of scientific, technological, and knowledge-based businesses? With the emergence and growth of several science parks throughout the world that helped create many thousands of scientific, technological and knowledge-based businesses, Portugal started to develop several science parks across the country. These include the Taguspark (in Oeiras), the Coimbra iParque (in Coimbra), the biocant (in Cantanhede), the Madeira Tecnopolo (in Funchal), Sines Tecnopolo (in Sines), Tecmaia (in Maia) and Parkurbis (in Covilhã). Companies locate in the Portuguese science parks to take advantage of a variety of services ranging from financial and legal advice through to marketing and technological support. Certain technological inventions of the period – whether of Arab or Chinese origin, or unique European innovations – were to have great influence on political and social developments, in particular gunpowder, the printing press and the compass. The introduction of gunpowder to the field of battle affected not only military organisation, but helped advance the nation state. Gutenberg's movable type printing press made possible not only the Reformation, but also a dissemination of knowledge that would lead to a gradually more egalitarian society. The compass, along with other innovations such as the cross-staff, the mariner's astrolabe, and advances in shipbuilding, enabled the navigation of the World Oceans, and the early phases of colonialism. Other inventions had a greater impact on everyday life, such as eyeglasses and the weight-driven clock.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
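MultipleNegativesRankingLoss uses in-batch negatives: for each question in a batch, its own context is the positive and every other context in the batch serves as a negative. The scaled cosine-similarity matrix is fed to a softmax cross-entropy whose correct class is the diagonal. A self-contained numpy sketch of that computation (illustrative, not the library implementation):

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    # Normalize rows so the similarity matrix holds cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = scale * (a @ p.T)  # scaled cosine similarity, shape (batch, batch)
    # Softmax cross-entropy with the diagonal (matching pairs) as the target class
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
positives = anchors + 0.01 * rng.normal(size=(4, 8))  # near-identical positives
loss = mnr_loss(anchors, positives)
print(loss)  # small, since each anchor matches its own positive best
```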
    

Evaluation Dataset

Unnamed Dataset

  • Size: 5,000 evaluation samples
  • Columns: question, context, and negative_1
  • Approximate statistics based on the first 1000 samples:
    • question: string; min 6, mean 14.64, max 52 tokens
    • context: string; min 28, mean 153.83, max 510 tokens
    • negative_1: string; min 28, mean 152.96, max 512 tokens
  • Samples:
    question context negative_1
    What was the name of the first sulfonamine antibiotic? Ehrlich’s approach of systematically varying the chemical structure of synthetic compounds and measuring the effects of these changes on biological activity was pursued broadly by industrial scientists, including Bayer scientists Josef Klarer, Fritz Mietzsch, and Gerhard Domagk. This work, also based in the testing of compounds available from the German dye industry, led to the development of Prontosil, the first representative of the sulfonamide class of antibiotics. Compared to arsphenamine, the sulfonamides had a broader spectrum of activity and were far less toxic, rendering them useful for infections caused by pathogens such as streptococci. In 1939, Domagk received the Nobel Prize in Medicine for this discovery. Nonetheless, the dramatic decrease in deaths from infectious diseases that occurred prior to World War II was primarily the result of improved public health measures such as clean water and less crowded housing, and the impact of anti-infective drugs and vaccines was sign... The first sulfonamide and first commercially available antibacterial, Prontosil, was developed by a research team led by Gerhard Domagk in 1932 at the Bayer Laboratories of the IG Farben conglomerate in Germany. Domagk received the 1939 Nobel Prize for Medicine for his efforts. Prontosil had a relatively broad effect against Gram-positive cocci, but not against enterobacteria. Research was stimulated apace by its success. The discovery and development of this sulfonamide drug opened the era of antibacterials.
    Who disregarded warnings about dams in the area? An article in Science suggested that the construction and filling of the Zipingpu Dam may have triggered the earthquake. The chief engineer of the Sichuan Geology and Mineral Bureau said that the sudden shift of a huge quantity of water into the region could have relaxed the tension between the two sides of the fault, allowing them to move apart, and could have increased the direct pressure on it, causing a violent rupture. The effect was "25 times more" than a year's worth of natural stress from tectonic movement. The government had disregarded warnings about so many large-scale dam projects in a seismically active area. Researchers have been denied access to seismological and geological data to examine the cause of the quake further. An article in Science suggested that the construction and filling of the Zipingpu Dam may have triggered the earthquake. The chief engineer of the Sichuan Geology and Mineral Bureau said that the sudden shift of a huge quantity of water into the region could have relaxed the tension between the two sides of the fault, allowing them to move apart, and could have increased the direct pressure on it, causing a violent rupture. The effect was "25 times more" than a year's worth of natural stress from tectonic movement. The government had disregarded warnings about so many large-scale dam projects in a seismically active area. Researchers have been denied access to seismological and geological data to examine the cause of the quake further.
    What annual ceremony do Freemasons have? The bulk of Masonic ritual consists of degree ceremonies. Candidates for Freemasonry are progressively initiated into Freemasonry, first in the degree of Entered Apprentice. Some time later, in a separate ceremony, they will be passed to the degree of Fellowcraft, and finally they will be raised to the degree of Master Mason. In all of these ceremonies, the candidate is entrusted with passwords, signs and grips peculiar to his new rank. Another ceremony is the annual installation of the Master and officers of the Lodge. In some jurisdictions Installed Master is valued as a separate rank, with its own secrets to distinguish its members. In other jurisdictions, the grade is not recognised, and no inner ceremony conveys new secrets during the installation of a new Master of the Lodge. Freemasonry consists of fraternal organisations that trace their origins to the local fraternities of stonemasons, which from the end of the fourteenth century regulated the qualifications of stonemasons and their interaction with authorities and clients. The degrees of freemasonry retain the three grades of medieval craft guilds, those of Apprentice, Journeyman or fellow (now called Fellowcraft), and Master Mason. These are the degrees offered by Craft (or Blue Lodge) Freemasonry. Members of these organisations are known as Freemasons or Masons. There are additional degrees, which vary with locality and jurisdiction, and are usually administered by different bodies than the craft degrees.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • num_train_epochs: 8
  • warmup_ratio: 0.1
  • fp16: True
  • batch_sampler: no_duplicates
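The non-default values above map onto the Sentence Transformers training configuration roughly as follows. This is a hedged sketch assuming the v3+ SentenceTransformerTrainingArguments API; the output_dir name is a placeholder:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="emb-bge-base-en-v1.5-squad-8-epochs",  # placeholder
    num_train_epochs=8,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    warmup_ratio=0.1,
    fp16=True,  # mixed-precision training
    eval_strategy="steps",
    # Avoid duplicate texts in a batch, which would act as false in-batch negatives
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```

The no_duplicates batch sampler matters specifically because MultipleNegativesRankingLoss treats every other in-batch context as a negative; a duplicated context would be penalized as a negative despite being a true match.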

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 8
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss gooqa-dev_cosine_accuracy
-1 -1 - - 0.3564
0.2890 100 0.7631 0.8236 0.3766
0.5780 200 0.4816 0.7701 0.3962
0.8671 300 0.4197 0.7316 0.4012
1.1561 400 0.3274 0.7281 0.4104
1.4451 500 0.2834 0.7302 0.4078
1.7341 600 0.2677 0.7327 0.4036
2.0231 700 0.2654 0.7161 0.4122
2.3121 800 0.1517 0.7344 0.4094
2.6012 900 0.1558 0.7256 0.4174
2.8902 1000 0.1604 0.7256 0.4110
3.1792 1100 0.1214 0.7413 0.4110
3.4682 1200 0.1024 0.7434 0.4124
3.7572 1300 0.1064 0.7384 0.4126
4.0462 1400 0.1024 0.7465 0.4114
4.3353 1500 0.0742 0.7551 0.4180
4.6243 1600 0.0756 0.7664 0.4128
4.9133 1700 0.0761 0.7566 0.4136
5.2023 1800 0.0645 0.7629 0.4126
5.4913 1900 0.0589 0.7709 0.4160
5.7803 2000 0.061 0.7709 0.4122
6.0694 2100 0.0575 0.7735 0.4116
6.3584 2200 0.0484 0.7798 0.4134
6.6474 2300 0.0503 0.7820 0.4098
6.9364 2400 0.0505 0.7778 0.4086
7.2254 2500 0.0449 0.7826 0.4100
7.5145 2600 0.0449 0.7838 0.4082
7.8035 2700 0.0442 0.7864 0.4070
-1 -1 - - 0.4066

Framework Versions

  • Python: 3.11.0
  • Sentence Transformers: 4.0.1
  • Transformers: 4.50.3
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.5.2
  • Datasets: 3.5.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}