---
license: apache-2.0
datasets:
- Pravesh390/country-capital-mixed
language:
- en
library_name: transformers
pipeline_tag: text2text-generation
tags:
- qlora
- flan-t5
- prompt-tuning
- question-answering
- hallucination
- robust-qa
- country-capital
model-index:
- name: flan-t5-qlora-countryqa-v1
  results:
  - task:
      type: text2text-generation
      name: Text2Text Generation
    dataset:
      type: Pravesh390/country-capital-mixed
      name: Country-Capital Mixed QA
    metrics:
    - type: bleu
      value: 92.5
    - type: rouge
      value: 87.3
---
# FLAN-T5 QLoRA (Prompt Tuned) - Country Capital QA
This model is a fine-tuned version of `google/flan-t5-base`, trained with **QLoRA** and **Prompt Tuning** on a hybrid QA dataset that mixes correct and deliberately incorrect (hallucinated) country-capital pairs.
## Highlights
- Trained on both correct and incorrect (hallucinated) QA pairs
- 4-bit QLoRA fine-tuning with PEFT
- Prompt tuning for parameter-efficient adaptation
## Training
- Base model: `google/flan-t5-base`
- Method: **QLoRA** + **Prompt Tuning** with PEFT (see the sketch below)
- Quantization: 4-bit NF4
- Frameworks: Hugging Face Transformers, PEFT, Accelerate
- Evaluation: BLEU = 92.5, ROUGE = 87.3
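A minimal sketch of this setup, assuming standard Transformers, bitsandbytes, and PEFT APIs. The LoRA hyperparameters (`r`, `lora_alpha`, `lora_dropout`, `target_modules`) are illustrative assumptions, not the exact values used for this checkpoint:

```python
import torch
from transformers import AutoModelForSeq2SeqLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization, as listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForSeq2SeqLM.from_pretrained(
    "google/flan-t5-base",
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# QLoRA adapter on the T5 attention projections (assumed hyperparameters).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q", "v"],
    task_type="SEQ_2_SEQ_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

The prompt-tuning side follows the same `get_peft_model` pattern with `peft.PromptTuningConfig` in place of `LoraConfig`.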
## Dataset
A mixture of 20 correct and 3 deliberately incorrect QA samples from `Pravesh390/country-capital-mixed`.
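For reference, the dataset can be loaded directly from the Hub:

```python
from datasets import load_dataset

# Mixed correct/incorrect country-capital QA pairs.
ds = load_dataset("Pravesh390/country-capital-mixed")
print(ds)
```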
## Usage
```python
from transformers import pipeline

# Load the fine-tuned model behind a text2text-generation pipeline.
pipe = pipeline("text2text-generation", model="Pravesh390/flan-t5-qlora-countryqa-v1")

result = pipe("What is the capital of Brazil?")
print(result[0]["generated_text"])
```
## Intended Use
- Evaluating hallucinations in QA systems (a small probe example follows this list)
- Building more robust QA models for real-world use
- Academic research and education
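A minimal hallucination probe, reusing the pipeline from the Usage section; the question/expected pair below is a hand-picked example, not part of an official evaluation:

```python
from transformers import pipeline

pipe = pipeline("text2text-generation", model="Pravesh390/flan-t5-qlora-countryqa-v1")

# A classic hallucination-prone pair: "Sydney" is often produced instead of
# the correct answer, "Canberra".
question = "What is the capital of Australia?"
expected = "Canberra"

answer = pipe(question)[0]["generated_text"]
print(f"model: {answer} | expected: {expected} | match: {expected.lower() in answer.lower()}")
```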
## License
Apache 2.0. Free for research and commercial use.