BERTić-SentiComments-SR-Six-way

BERTić-SentiComments-SR-Six-way is a variant of the BERTić model, fine-tuned for the task of six-way sentiment classification of Serbian short texts. It differentiates between objective-positive (+NS), objective-negative (-NS), ambiguous/mixed-positive (+M), ambiguous/mixed-negative (-M), clearly positive (+1), and clearly negative (-1) texts. The model was fine-tuned for 5 epochs on the SentiComments.SR dataset.
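The model can be loaded with the Hugging Face transformers library. Below is a minimal inference sketch; the example comment is hypothetical, and the predicted label name is read from the mapping stored in the model configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "ICEF-NLP/bcms-bertic-senticomments-sr-sixway"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical example comment; replace with your own Serbian short text.
text = "Film je odličan, ali kraj me je razočarao."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to its label name using the
# id2label mapping stored in the model configuration.
print(model.config.id2label[logits.argmax(dim=-1).item()])
```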

Benchmarking

This model was evaluated on the task of six-way sentiment classification of short texts in Serbian from the SentiComments.SR dataset and compared to multilingual BERT. Different lengths of fine-tuning were considered, ranging from 1 to 5 epochs. Linear classifiers relying on bag-of-words (BOW) and/or bag-of-embeddings (BOE) features were used as baselines.

Since the dataset is imbalanced, the weighted F1 score was used as the performance metric. Model fine-tuning and evaluation were performed using 10-fold stratified cross-validation. The code and data needed to run these experiments are available in the SentiComments.SR GitHub repository.
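For illustration, the kind of BOW baseline and evaluation protocol described above can be sketched with scikit-learn as follows. The tiny synthetic texts and labels are placeholders, and the vectorizer and classifier settings are assumptions rather than the configuration used in the original experiments (see the SentiComments.SR GitHub repository for the actual code).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline

# Tiny synthetic placeholder data, repeated so that 10-fold stratified
# cross-validation is feasible; use the real SentiComments.SR corpus in practice.
texts = ["Odličan film!", "Užasno razočaranje.", "Solidno, ali predugo."] * 10
labels = ["+1", "-1", "+M"] * 10

# A BOW baseline: bag-of-words features fed into a linear classifier.
# Hyperparameters here are placeholders, not the original settings.
bow_baseline = make_pipeline(
    CountVectorizer(),
    LogisticRegression(max_iter=1000),
)

# 10-fold stratified cross-validation scored with the weighted F1 measure,
# mirroring the protocol described above.
skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
scores = cross_val_score(bow_baseline, texts, labels, cv=skf, scoring="f1_weighted")
print(f"Mean weighted F1 over 10 folds: {scores.mean():.3f}")
```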

Results

| Model | Weighted F1 |
| --- | --- |
| Baseline - Linear classifier with BOW features | 0.566 |
| Baseline - Linear classifier with BOE features | 0.557 |
| Baseline - Linear classifier with BOW+BOE features | 0.586 |
| Multilingual BERT, 1 epoch | 0.493 |
| BERTić-SentiComments-SR-Six-way, 1 epoch | 0.652 |
| Multilingual BERT, 3 epochs | 0.601 |
| BERTić-SentiComments-SR-Six-way, 3 epochs | 0.735 |
| Multilingual BERT, 5 epochs | 0.606 |
| BERTić-SentiComments-SR-Six-way, 5 epochs | 0.741 |

References

If you wish to use this model in your paper or project, please cite the following papers:
