---
language: en
license: mit
tags:
- spiritual-ai
- brahma-kumaris
- murli
- distilgpt2
- ultra-lite
- peft
- lora
library_name: peft
base_model: distilgpt2
---

# 🕉️ Murli Assistant - DistilGPT-2 Ultra-Lite

An **ultra-lightweight** spiritual AI assistant trained on Brahma Kumaris murli content. Perfect for free Colab and low-resource environments!

## 🎯 Why This Model?

- **82M parameters** (30x smaller than Phi-2)
- **RAM: ~1-2 GB** (fits easily in free Colab)
- **Fast inference**: 0.5-1 second per response
- **No quantization needed**: Runs in full precision
- **Perfect for free tier**: No crashes, no OOM errors

## Model Details

- **Base Model**: DistilGPT-2 (82M parameters)
- **Fine-tuning**: LoRA (Low-Rank Adaptation)
- **Training Data**: 150 authentic murlis
- **Training Examples**: 153+
- **Max Length**: 256 tokens
- **LoRA Rank**: 4 (see the configuration sketch below)
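
For reference, the settings above map onto a PEFT `LoraConfig` roughly as follows. Only the rank comes from this card; `lora_alpha`, the dropout, and the `c_attn` target module are illustrative assumptions for GPT-2-family models, not confirmed training values:

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=4,                        # LoRA rank, from the Model Details above
    lora_alpha=8,               # assumed scaling factor
    lora_dropout=0.05,          # assumed dropout
    target_modules=["c_attn"],  # GPT-2's fused attention projection (assumed)
    task_type="CAUSAL_LM",
)
```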

## Usage

### Quick Start (Colab)

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load the base model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers have no pad token by default
base_model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Load the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, "eswarankrishnamurthy/murli-assistant-distilgpt2-lite")
model.eval()

# Chat function
def chat(message):
    prompt = f"Q: {message}\nA:"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=150,
        pad_token_id=tokenizer.eos_token_id,  # silences the missing-pad-token warning
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Try it
response = chat("Om Shanti")
print(response)
```
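
The `generate` call above decodes greedily. For more varied phrasing, you can enable sampling inside `chat`; these generation settings are illustrative defaults, not tuned values from this card:

```python
# Drop-in replacement for the generate call in chat()
outputs = model.generate(
    **inputs,
    max_new_tokens=150,
    do_sample=True,    # sample instead of greedy decoding
    temperature=0.7,   # assumed value; lower = more focused
    top_p=0.9,         # assumed nucleus-sampling cutoff
    pad_token_id=tokenizer.eos_token_id,
)
```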

### Use in Production

See the full Colab notebook: `murli-distilgpt2-colab.ipynb`
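
For deployment, you may prefer to merge the LoRA weights into the base model so that inference code no longer needs the `peft` dependency. A minimal sketch; the output directory name is a hypothetical example:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base_model = AutoModelForCausalLM.from_pretrained("distilgpt2")
model = PeftModel.from_pretrained(base_model, "eswarankrishnamurthy/murli-assistant-distilgpt2-lite")

# Fold the LoRA deltas into the base weights and drop the adapter wrappers
merged = model.merge_and_unload()

# Save as a plain transformers model (directory name is illustrative)
merged.save_pretrained("murli-distilgpt2-merged")
AutoTokenizer.from_pretrained("distilgpt2").save_pretrained("murli-distilgpt2-merged")
```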

## Comparison with Other Models

| Model | Parameters | RAM | Inference | Colab Free |
|-------|------------|-----|-----------|------------|
| **DistilGPT-2 (This)** | 82M | ~1-2 GB | 0.5-1s | ✅ Perfect |
| Phi-2 | 2.7B | ~10 GB | 1-3s | ❌ Crashes |
| Phi-2 (4-bit) | 2.7B | ~3-4 GB | 1-3s | ⚠️ Tight fit |

## Advantages

✅ **Ultra-Lightweight**: 30x smaller than Phi-2

✅ **Low RAM**: Only 1-2 GB needed

✅ **Fast Training**: 5-10 minutes

✅ **Fast Inference**: Sub-second responses

✅ **Free Colab**: Perfect fit, no crashes

✅ **Easy Deployment**: Simple integration

✅ **Good Quality**: Excellent for basic Q&A

## Training Details

- **Method**: LoRA fine-tuning (rank 4) on DistilGPT-2
- **Data**: 150 authentic murlis (153+ training examples)
- **Sequence length**: 256 tokens
- **Training time**: 5-10 minutes on free Colab
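
For reference, a minimal sketch of what the fine-tuning could look like with `peft` and `transformers`. The dataset contents, epochs, learning rate, and the LoRA hyperparameters other than rank are illustrative assumptions, not the exact recipe behind this adapter:

```python
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Rank-4 LoRA adapter; alpha/dropout/target modules repeat the assumptions above
model = get_peft_model(model, LoraConfig(
    r=4, lora_alpha=8, lora_dropout=0.05,
    target_modules=["c_attn"], task_type="CAUSAL_LM",
))

# Hypothetical Q&A pairs in the same "Q: ...\nA: ..." format used at inference
examples = [{"text": "Q: What is soul consciousness?\nA: ... Om Shanti"}]
dataset = Dataset.from_list(examples)

def tokenize(batch):
    # Truncate to the 256-token max length from the Model Details
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="murli-lora", num_train_epochs=3, learning_rate=2e-4),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("murli-lora")  # adapter weights only
```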

## Example Responses

**Q:** Om Shanti

**A:** Om Shanti, sweet child! 🙏 I'm your Murli Helper. How can I guide you today?

**Q:** What is soul consciousness?

**A:** Soul consciousness is experiencing yourself as an eternal, pure soul with peace, love, and purity. Om Shanti 🙏

**Q:** Who is Baba?

**A:** Baba is the Supreme Soul, the Ocean of Knowledge who teaches Raja Yoga through Brahma. Om Shanti 🙏

## Limitations

- Shorter context (256 tokens vs Phi-2's 512); see the prompt-budgeting sketch below
- Simpler responses compared to larger models
- Best for focused Q&A, not long essays
- Limited reasoning compared to billion-parameter models
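
Because the usable window is short, it helps to budget tokens explicitly so the prompt plus the generated answer stay within the trained length. A minimal sketch, reusing `model` and `tokenizer` from the Quick Start (the 150-token answer budget is just an example):

```python
MAX_LENGTH = 256     # trained max length from the Model Details
ANSWER_BUDGET = 150  # tokens reserved for the generated answer

prompt = "Q: What is soul consciousness?\nA:"
inputs = tokenizer(
    prompt,
    return_tensors="pt",
    truncation=True,
    max_length=MAX_LENGTH - ANSWER_BUDGET,  # keep room for the answer
)
outputs = model.generate(
    **inputs,
    max_new_tokens=ANSWER_BUDGET,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```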

## License

MIT License - Free to use and modify

## Citation

```bibtex
@misc{murli-distilgpt2-lite,
  author = {eswarankrishnamurthy},
  title = {Murli Assistant - DistilGPT-2 Ultra-Lite},
  year = {2025},
  publisher = {HuggingFace},
  url = {https://huggingface.co/eswarankrishnamurthy/murli-assistant-distilgpt2-lite}
}
```

## Acknowledgments

- Brahma Kumaris World Spiritual University for murli teachings
- HuggingFace for model hosting
- The DistilGPT-2 team for the base model

---

**Om Shanti! 🙏**