SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
This is a sentence-transformers model fine-tuned from sentence-transformers/all-MiniLM-L6-v2 with a LoRA adapter (loaded as a PeftModelForFeatureExtraction; see the architecture below). It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: sentence-transformers/all-MiniLM-L6-v2
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 384 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: PeftModelForFeatureExtraction
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
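The Pooling module above uses mean pooling (pooling_mode_mean_tokens: True): the token embeddings produced by the Transformer are averaged, respecting the attention mask, into a single 384-dimensional sentence vector. A minimal sketch of that operation in plain PyTorch, assuming token_embeddings of shape [batch, seq_len, 384] and an attention_mask from the tokenizer:

import torch

def mean_pooling(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Expand the mask to the embedding dimension so padding tokens are zeroed out
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    # Sum the real token embeddings and divide by the number of real tokens
    summed = torch.sum(token_embeddings * mask, dim=1)
    counts = torch.clamp(mask.sum(dim=1), min=1e-9)
    return summed / counts  # shape: [batch, 384]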
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("manupande21/all-MiniLM-L6-v2-LoRA-finetuned-1M")
# Run inference
sentences = [
'what is the purpose of bands with braces',
'The Purpose of Rubber Bands for Braces. If you have gotten a professional consultation from an orthodontist, you may have been given rubber bands for braces; lots of doctors recommend using rubber bands in conjunction with braces to help straighten teeth quicker.',
'To successfully complete your orthodontic treatment plan, patients must work together with the orthodontist. The teeth and jaws can only move toward their corrected positions if the patient consistently wears the elastics (rubber bands), headgear or other appliances as prescribed. The following paragraphs describe the types of appliances that may be used during your treatment. Elastics (Rubber Bands) Wearing elastics (rubber bands) is the main way that Orthodontic treatment can improve the fit of your bite. Braces will straighten your teeth, but are unable to correct bite problems unless used in conjunction with elastics.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
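Because the similarity function is cosine similarity, the same two calls can rank passages for a query. A small, hypothetical semantic-search sketch built on the encode and similarity methods shown above (the query and corpus here are illustrative):

# Treat the first sentence as the query and the rest as the corpus
query = "what is the purpose of bands with braces"
corpus = sentences[1:]

query_emb = model.encode([query])
corpus_emb = model.encode(corpus)

# Cosine-similarity scores, shape [1, len(corpus)]; higher is more similar
scores = model.similarity(query_emb, corpus_emb)
best = scores.argmax().item()
print(corpus[best])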
Evaluation
Metrics
Triplet
- Dataset: test-eval
- Evaluated with TripletEvaluator
| Metric | Value |
|---|---|
| cosine_accuracy | 0.9559 |
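cosine_accuracy is the fraction of test triplets for which the anchor is closer (by cosine similarity) to its positive than to its negative. A minimal re-implementation sketch, assuming parallel lists of anchor, positive, and negative strings:

import torch

def triplet_cosine_accuracy(model, anchors, positives, negatives):
    a = model.encode(anchors, convert_to_tensor=True)
    p = model.encode(positives, convert_to_tensor=True)
    n = model.encode(negatives, convert_to_tensor=True)
    pos_sim = torch.nn.functional.cosine_similarity(a, p, dim=1)
    neg_sim = torch.nn.functional.cosine_similarity(a, n, dim=1)
    # A triplet counts as correct when the positive outscores the negative
    return (pos_sim > neg_sim).float().mean().item()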
Training Details
Training Dataset
Unnamed Dataset
- Size: 800,000 training samples
- Columns: sentence_0, sentence_1, and sentence_2
- Approximate statistics based on the first 1000 samples:
| | sentence_0 | sentence_1 | sentence_2 |
|---|---|---|---|
| type | string | string | string |
| details | min: 4 tokens, mean: 9.05 tokens, max: 38 tokens | min: 15 tokens, mean: 80.09 tokens, max: 232 tokens | min: 21 tokens, mean: 78.11 tokens, max: 192 tokens |
- Samples:

| sentence_0 | sentence_1 | sentence_2 |
|---|---|---|
| american modernism is characterized by _____. | FROM MODERNISM TO POST-MODERNISM. POST-WAR ART IN AMERICA. After the Second World War, the art world was characterized by “triumphalism” in New York and a feeling of having won, not just a military war but also a cultural war. The French and their School of Paris had been routed. Also defeated was American Scene painting and its nativist illustrations of a naïve nation. | noun. 1 a disease of sheep characterized by an unsteady gait and staggering, caused by infestation of the brain with tapeworms (Taenia caenuris).oun. 1 a disease of sheep characterized by an unsteady gait and staggering, caused by infestation of the brain with tapeworms (Taenia caenuris). |
| where was sen tom cotton born | Written By: born. Tom Cotton, in full Thomas Bryant Cotton (born May 13, 1977, Dardanelle, Arkansas, U.S.), American politician who was elected to the U.S. Senate as a Republican in 2014 and began his first term representing Arkansas the following year. He previously was a member of the U.S. House of Representatives (2013–15). | Cotton Facts: 1 The Cotton Belt spans the southern half of the United States, stretching from Virginia to California. 2 Cotton production covers more than 14 million acres or about 22,000 square miles of the United States. 3 Texas is the leading cotton-producing state, producing about 4.5 million bales of cotton a year. 4 Cotton contributes over $1 5 ... Tex Cotton production covers more than 14 million acres or about 22,000 square miles of the United States. 2 Texas is the leading cotton-producing state, producing about 4.5 million bales of cotton a year. 3 Cotton contributes over $1 billion to the Texas economy, ranking only behind the beef industry in total cash receipts. |
| what is ginger beer | Gosling's Ginger Beer. Crabbie's Alcoholic Ginger Beer was first crafted in 1801. Ginger beer is carbonated, sweetened beverage produced in two versions: alcoholic brewed ginger beer (which includes home-brewed) or a carbonated soft drink flavored primarily with ginger and sweetened with sugar or artificial sweeteners.oday, ginger beer is usually produced as a soft drink. Ginger beer and ginger ale as soft drinks have been moderately popular in many parts of the world since they were introduced. The similarities and differences between ginger ale and ginger beer are discussed in the history section of the ginger ale article. | Ginger is a commonly used flavoring agent and food product. Ginger is also available as an herbal supplement. The information contained in this leaflet refers to the use of ginger as an herbal supplement. |

- Loss: TripletLoss with these parameters:
{
    "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
    "triplet_margin": 5
}
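With TripletDistanceMetric.EUCLIDEAN and triplet_margin: 5, the per-triplet objective is max(‖a − p‖ − ‖a − n‖ + 5, 0): the anchor must be at least 5 units (in Euclidean distance) closer to the positive than to the negative before the loss reaches zero. A sketch of that objective on precomputed embedding tensors:

import torch

def triplet_loss(anchor, positive, negative, margin: float = 5.0):
    # Euclidean distances between anchor/positive and anchor/negative embeddings
    d_pos = torch.nn.functional.pairwise_distance(anchor, positive, p=2)
    d_neg = torch.nn.functional.pairwise_distance(anchor, negative, p=2)
    # Hinge: push positives at least `margin` closer than negatives
    return torch.relu(d_pos - d_neg + margin).mean()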
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 512
- per_device_eval_batch_size: 512
- fp16: True
- multi_dataset_batch_sampler: round_robin
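For reference, these non-default values correspond to a trainer configuration along these lines (a sketch; the output_dir path is hypothetical and the dataset wiring is omitted):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/all-MiniLM-L6-v2-LoRA",  # hypothetical path
    eval_strategy="steps",
    per_device_train_batch_size=512,
    per_device_eval_batch_size=512,
    fp16=True,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)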
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 512
- per_device_eval_batch_size: 512
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
Training Logs
| Epoch | Step | Training Loss | test-eval_cosine_accuracy |
|---|---|---|---|
| 0.3199 | 500 | 3.4281 | - |
| 0.6398 | 1000 | 1.7972 | 0.9516 |
| 0.9597 | 1500 | 1.5857 | - |
| 1.0 | 1563 | - | 0.9536 |
| 1.2796 | 2000 | 1.5243 | 0.9542 |
| 1.5995 | 2500 | 1.481 | - |
| 1.9194 | 3000 | 1.4559 | 0.9553 |
| 2.0 | 3126 | - | 0.9559 |
Framework Versions
- Python: 3.11.5
- Sentence Transformers: 4.1.0
- Transformers: 4.41.0
- PyTorch: 2.7.0+cu126
- Accelerate: 1.7.0
- Datasets: 3.2.0
- Tokenizers: 0.19.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
TripletLoss
@misc{hermans2017defense,
title={In Defense of the Triplet Loss for Person Re-Identification},
author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
year={2017},
eprint={1703.07737},
archivePrefix={arXiv},
primaryClass={cs.CV}
}