DistilBERT Sentiment Classifier (Hybrid-Sentiment-Engine)

This model is a fine-tuned version of distilbert-base-uncased for binary sentiment classification (positive/negative).
It achieves approximately 90% F1 score on the IMDb dataset.

The model is part of the Hybrid-Sentiment-Engine, a project combining classical ML (TF-IDF + Logistic Regression) with transformer-based deep learning for flexible, high-accuracy sentiment analysis.
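
For context, the classical side of such a hybrid setup can be sketched as follows. This is an illustrative scikit-learn baseline with placeholder data, not the project's actual pipeline:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative TF-IDF + Logistic Regression baseline of the kind paired with
# the transformer model in the Hybrid-Sentiment-Engine.
baseline = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), max_features=50_000),
    LogisticRegression(max_iter=1000),
)

train_texts = ["great movie", "terrible acting"]   # placeholder data
train_labels = [1, 0]                              # 1 = positive, 0 = negative
baseline.fit(train_texts, train_labels)
print(baseline.predict(["what a fantastic film"]))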

🔥 Model Details

  • Base model: distilbert-base-uncased
  • Task: Sentiment Analysis (binary)
  • Dataset: IMDb reviews
  • Languages: English
  • Framework: Hugging Face Transformers
  • License: MIT

📊 Performance

Metric      Score
F1 Score    ~90%
Accuracy    ~90%
Precision   ~89%
Recall      ~90%
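
These figures were reported for the IMDb data. A minimal sketch of how metrics in this range could be checked on the IMDb test split is shown below; it assumes the datasets and scikit-learn packages and is not the original evaluation script. The label-string handling is an assumption, so verify it against model.config.id2label.

from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
from transformers import pipeline

clf = pipeline("sentiment-analysis", model="HarshGahlaut/distilb-sentiment-best")

# Subsample the IMDb test split for a quick check; use the full split for final numbers.
test = load_dataset("imdb", split="test").shuffle(seed=42).select(range(2000))
preds = clf(test["text"], batch_size=32, truncation=True)

# Assumes labels follow IMDb's convention (0 = negative, 1 = positive);
# adjust if the model's label strings differ.
y_pred = [1 if p["label"].upper().endswith(("POSITIVE", "_1")) else 0 for p in preds]
y_true = test["label"]

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("f1       :", f1_score(y_true, y_pred))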

📥 Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "HarshGahlaut/distilb-sentiment-best"

# Load the fine-tuned tokenizer and classification model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
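
Continuing from the snippet above, a minimal inference sketch follows. The example text is a placeholder, and the id-to-label mapping is read from the model config rather than assumed:

import torch

text = "This movie was surprisingly good!"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring class and map it back to a label string.
pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id], torch.softmax(logits, dim=-1)[0, pred_id].item())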