BERTić-SentiComments-SR-Polarity

BERTić-SentiComments-SR-Polarity is a variant of the BERTić model, fine-tuned for polarity detection in short Serbian texts. It classifies a text as either negative (−) or positive (+). The model was fine-tuned for 5 epochs on the SentiComments.SR dataset.
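A minimal usage sketch with the Hugging Face `transformers` library. The exact label names returned by this checkpoint are an assumption here (they may be generic `LABEL_0`/`LABEL_1` depending on the checkpoint's config):

```python
from transformers import pipeline

# Load the fine-tuned polarity model from the Hugging Face Hub.
clf = pipeline(
    "text-classification",
    model="ICEF-NLP/bcms-bertic-senticomments-sr-polarity",
)

# Classify a short Serbian comment; the output is a list of
# {"label": ..., "score": ...} dicts. Label names depend on the
# checkpoint's config and may not be human-readable.
result = clf("Ovo je odličan film!")
print(result)
```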

Benchmarking

This model was evaluated on the task of polarity detection in short texts in Serbian from the SentiComments.SR dataset and compared to multilingual BERT. Different lengths of fine-tuning were considered, ranging from 1 to 5 epochs. Linear classifiers relying on bag-of-words (BOW) and/or bag-of-embeddings (BOE) features were used as baselines.

Since the dataset is imbalanced, the weighted F1 measure was used as the performance metric. Model fine-tuning and evaluation were performed using 10-fold stratified cross-validation. The code and data needed to reproduce these experiments are available in the SentiComments.SR GitHub repository.
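The baseline evaluation protocol can be sketched with scikit-learn as follows. This is a minimal illustration on toy stand-in data, not the actual experiment code; the real runs use the SentiComments.SR texts and labels:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import StratifiedKFold
from sklearn.pipeline import make_pipeline

# Toy stand-in data; replace with the SentiComments.SR texts and labels.
texts = ["odličan film", "loše iskustvo", "super usluga", "grozan proizvod"] * 10
labels = [1, 0, 1, 0] * 10

# A linear classifier over bag-of-words features (the BOW baseline).
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))

scores = []
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
for train_idx, test_idx in cv.split(texts, labels):
    X_train = [texts[i] for i in train_idx]
    y_train = [labels[i] for i in train_idx]
    X_test = [texts[i] for i in test_idx]
    y_test = [labels[i] for i in test_idx]
    model.fit(X_train, y_train)
    # Weighted F1 accounts for the class imbalance noted above.
    scores.append(f1_score(y_test, model.predict(X_test), average="weighted"))

print(sum(scores) / len(scores))
```

Stratified folds keep the class ratio of the full dataset in every fold, which matters here because the label distribution is skewed.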

Results

| Model | Weighted F1 |
| --- | --- |
| Baseline: linear classifier with BOW features | 0.782 |
| Baseline: linear classifier with BOE features | 0.783 |
| Baseline: linear classifier with BOW+BOE features | 0.783 |
| Multilingual BERT, 1 epoch | 0.733 |
| BERTić-SentiComments-SR-Polarity, 1 epoch | 0.882 |
| Multilingual BERT, 3 epochs | 0.777 |
| BERTić-SentiComments-SR-Polarity, 3 epochs | 0.889 |
| Multilingual BERT, 5 epochs | 0.778 |
| BERTić-SentiComments-SR-Polarity, 5 epochs | 0.889 |

References

If you wish to use this model in your paper or project, please cite the following papers:
