---
pretty_name: "JBCS2025: AES Experimental Logs and Predictions"
license: "cc-by-nc-4.0"
configs:
  - config_name: evaluation_results
    data_files:
      - split: evaluation_results
        path: evaluation_results-*.parquet
  - config_name: bootstrap_confidence_intervals
    data_files:
      - split: bootstrap_confidence_intervals
        path: bootstrap_confidence_intervals-*.parquet
tags:
- automatic-essay-scoring
- portuguese
- text-classification
---
# JBCS 2025: Experimental Artefacts for AES in Brazilian Portuguese
This repository contains all experimental artefacts (logs, configurations, predictions, and evaluation results) described in the paper:
> **Exploring the Usage of LLMs for Automatic Essay Scoring in Brazilian Portuguese Essays**
> André Barbosa, Igor Cataneo Silveira, Denis Deratani Mauá
> TODO
---
## 📦 What's in this dataset repo?
This dataset is **not a training dataset**. Instead, it provides comprehensive logs and outputs from experiments evaluating different language models for Automatic Essay Scoring (AES) tasks in Brazilian Portuguese.
Specifically, it contains:
- 🔁 **JSONL files**: raw predictions from each evaluated model (see the parsing sketch after this list).
- 📊 **CSV files**: detailed performance metrics (Quadratic Weighted Kappa, F1-score, etc.).
- ⚙️ **YAML files**: complete Hydra configurations for reproducibility.
- 📋 **Log files**: logs detailing each evaluation run.
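As a quick orientation, here is a minimal sketch of reading one of the JSONL prediction files. The path and the shape of each record are illustrative assumptions, not a guaranteed schema:
```python
import json
from pathlib import Path

# Hypothetical path: substitute any JSONL prediction file from this repo.
for line in Path("predictions/example_run.jsonl").read_text(encoding="utf-8").splitlines():
    if line.strip():                   # skip blank lines
        record = json.loads(line)      # one model prediction per line
        print(record)
```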
---
## 📚 Related Collection
All models and datasets related to this work are available in the Hugging Face collection:
🔗 [**AES JBCS2025 Collection**](https://huggingface.co/collections/kamel-usp/jbcs2025-67d5e73a4b89c1f0c878159c)
---
## 📊 Evaluated Models
The table below lists all models trained and evaluated for each essay competence (C1 to C5), along with direct links to their Hugging Face repository pages:
| Model | Architecture | Training Type | Link |
|-------|--------------|---------------|------|
| mbert_base-C1 | Encoder-only | Fine-tuned | [mbert_base-C1](https://huggingface.co/kamel-usp/jbcs2025_mbert_base-C1) |
| mbert_base-C2 | Encoder-only | Fine-tuned | [mbert_base-C2](https://huggingface.co/kamel-usp/jbcs2025_mbert_base-C2) |
| mbert_base-C3 | Encoder-only | Fine-tuned | [mbert_base-C3](https://huggingface.co/kamel-usp/jbcs2025_mbert_base-C3) |
| mbert_base-C4 | Encoder-only | Fine-tuned | [mbert_base-C4](https://huggingface.co/kamel-usp/jbcs2025_mbert_base-C4) |
| mbert_base-C5 | Encoder-only | Fine-tuned | [mbert_base-C5](https://huggingface.co/kamel-usp/jbcs2025_mbert_base-C5) |
| bertimbau_base-C1 | Encoder-only | Fine-tuned | [bertimbau_base-C1](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_base-C1) |
| bertimbau_base-C2 | Encoder-only | Fine-tuned | [bertimbau_base-C2](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_base-C2) |
| bertimbau_base-C3 | Encoder-only | Fine-tuned | [bertimbau_base-C3](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_base-C3) |
| bertimbau_base-C4 | Encoder-only | Fine-tuned | [bertimbau_base-C4](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_base-C4) |
| bertimbau_base-C5 | Encoder-only | Fine-tuned | [bertimbau_base-C5](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_base-C5) |
| bertimbau_large-C1 | Encoder-only | Fine-tuned | [bertimbau_large-C1](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_large-C1) |
| bertimbau_large-C2 | Encoder-only | Fine-tuned | [bertimbau_large-C2](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_large-C2) |
| bertimbau_large-C3 | Encoder-only | Fine-tuned | [bertimbau_large-C3](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_large-C3) |
| bertimbau_large-C4 | Encoder-only | Fine-tuned | [bertimbau_large-C4](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_large-C4) |
| bertimbau_large-C5 | Encoder-only | Fine-tuned | [bertimbau_large-C5](https://huggingface.co/kamel-usp/jbcs2025_bertimbau_large-C5) |
| llama3-8b-C1 | Decoder-only | LoRA | [llama3-8b-C1](https://huggingface.co/kamel-usp/jbcs2025_llama3-8b-C1) |
| llama3-8b-C2 | Decoder-only | LoRA | [llama3-8b-C2](https://huggingface.co/kamel-usp/jbcs2025_llama3-8b-C2) |
| llama3-8b-C3 | Decoder-only | LoRA | [llama3-8b-C3](https://huggingface.co/kamel-usp/jbcs2025_llama3-8b-C3) |
| llama3-8b-C4 | Decoder-only | LoRA | [llama3-8b-C4](https://huggingface.co/kamel-usp/jbcs2025_llama3-8b-C4) |
| llama3-8b-C5 | Decoder-only | LoRA | [llama3-8b-C5](https://huggingface.co/kamel-usp/jbcs2025_llama3-8b-C5) |
| phi3.5-C1 | Decoder-only | LoRA | [phi3.5-C1](https://huggingface.co/kamel-usp/jbcs2025_phi3.5-C1) |
| phi3.5-C2 | Decoder-only | LoRA | [phi3.5-C2](https://huggingface.co/kamel-usp/jbcs2025_phi3.5-C2) |
| phi3.5-C3 | Decoder-only | LoRA | [phi3.5-C3](https://huggingface.co/kamel-usp/jbcs2025_phi3.5-C3) |
| phi3.5-C4 | Decoder-only | LoRA | [phi3.5-C4](https://huggingface.co/kamel-usp/jbcs2025_phi3.5-C4) |
| phi3.5-C5 | Decoder-only | LoRA | [phi3.5-C5](https://huggingface.co/kamel-usp/jbcs2025_phi3.5-C5) |
| phi4-C1 | Decoder-only | LoRA | [phi4-C1](https://huggingface.co/kamel-usp/jbcs2025_phi4-C1) |
| phi4-C2 | Decoder-only | LoRA | [phi4-C2](https://huggingface.co/kamel-usp/jbcs2025_phi4-C2) |
| phi4-C3 | Decoder-only | LoRA | [phi4-C3](https://huggingface.co/kamel-usp/jbcs2025_phi4-C3) |
| phi4-C4 | Decoder-only | LoRA | [phi4-C4](https://huggingface.co/kamel-usp/jbcs2025_phi4-C4) |
| phi4-C5 | Decoder-only | LoRA | [phi4-C5](https://huggingface.co/kamel-usp/jbcs2025_phi4-C5) |
🧠 Additionally, **API-only models** (e.g., DeepSeek-R1, ChatGPT-4o, Sabiá-3) were evaluated but are not hosted on the Hub. Their predictions and logs are still included in this dataset.
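The Hub-hosted checkpoints above can be pulled directly with `transformers`. A minimal sketch, assuming the encoder-only models were saved with a standard sequence-classification head (check each model card to confirm):
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Any encoder-only model from the table works the same way.
model_id = "kamel-usp/jbcs2025_bertimbau_base-C1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Texto de exemplo de uma redação.", return_tensors="pt")
logits = model(**inputs).logits  # scores over the competence grade classes
```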
---
## 🧪 How to Use this Dataset
You can load the data with the Hugging Face `datasets` library. Each config in this repository exposes a single split of the same name:
```python
from datasets import load_dataset

ds = load_dataset("kamel-usp/jbcs2025_experiments", "evaluation_results", split="evaluation_results")
```
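The bootstrap confidence intervals load the same way; converting to pandas (assuming the Parquet files are plain tables) is a convenient way to inspect them:
```python
from datasets import load_dataset

# Same repo, second config; the split name mirrors the config name.
ci = load_dataset("kamel-usp/jbcs2025_experiments", "bootstrap_confidence_intervals",
                  split="bootstrap_confidence_intervals")
print(ci.to_pandas().head())
```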
---
## 📄 License and Citation
This work is licensed under the [Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/).
If you use these artefacts, please cite our paper:
```bibtex
TODO
```