# DistilBERT Sentiment Classifier (Hybrid-Sentiment-Engine)

This model is a fine-tuned version of distilbert-base-uncased for binary sentiment classification (positive/negative). It achieves approximately 90% F1 score on the IMDb dataset.

The model is part of the Hybrid-Sentiment-Engine, a project that combines classical ML (TF-IDF + Logistic Regression) with transformer-based deep learning for flexible, high-accuracy sentiment analysis.
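The classical branch of the engine can be sketched roughly as follows. This is a minimal illustration using scikit-learn; the toy training data and pipeline settings are assumptions for demonstration, not taken from the project itself.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data for illustration; the real engine trains on IMDb reviews
texts = ["great movie, loved it", "wonderful acting", "terrible plot", "boring and slow"]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

# TF-IDF features (unigrams + bigrams) feeding a logistic regression classifier
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

preds = clf.predict(["loved the acting"])
```

A pipeline like this is fast to train and serves as a lightweight fallback alongside the transformer model.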
## Model Details
- Base model: distilbert-base-uncased
- Task: Sentiment Analysis (binary)
- Dataset: IMDb reviews
- Languages: English
- Framework: Hugging Face Transformers
- License: MIT
## Performance
| Metric | Score |
|---|---|
| F1 Score | ~90% |
| Accuracy | ~90% |
| Precision | ~89% |
| Recall | ~90% |
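These metrics can be reproduced from model predictions with scikit-learn. The label arrays below are placeholder values to show the computation, not the actual IMDb test outputs.

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Placeholder predictions; substitute the model's outputs on the IMDb test split
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 0, 1, 1]

f1 = f1_score(y_true, y_pred)
acc = accuracy_score(y_true, y_pred)
prec = precision_score(y_true, y_pred)
rec = recall_score(y_true, y_pred)
print(f"F1: {f1:.2f}  Accuracy: {acc:.2f}  Precision: {prec:.2f}  Recall: {rec:.2f}")
```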
## Usage

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "HarshGahlaut/distilb-sentiment-best"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Tokenize a review and predict its class (label names come from the model config)
inputs = tokenizer("A wonderful, heartfelt film.", return_tensors="pt")
pred = model(**inputs).logits.argmax(dim=-1).item()
print(model.config.id2label[pred])
```