This file contains all essay responses with their associated human-assigned scores. It also includes metadata about the evaluation context and student groups.

Column explanations:

- `Question_id`: ID from 1 to 12.
- `Course_Name`: The course associated with the essay.
- `Group_id`: Identifier for student groupings within each course.
- `Gender`: Gender of the student. (Note: male and female students were taught separately.)
- `Exam_Type`: Specifies whether the exam was traditional or online.
- `Essay_id`: Unique ID for each essay.
- `Essay`: Full text of the essay.
- `Rubric_a1`–`Rubric_a4`: Scores from the first evaluator (the course instructor), based on question-specific criteria.
- `Final_Score_A`: Sum of the rubric scores from the first evaluator.
- `Rubric_b1`–`Rubric_b4`: Scores from the second, independent evaluator.
- `Final_Score_B`: Sum of the rubric scores from the second evaluator.
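To illustrate how the score columns relate, here is a minimal sketch in plain Python. The row values below are synthetic illustration data (not taken from the dataset); only the column names follow the schema described above.

```python
# A single synthetic row using the documented column names.
# The rubric values here are made up for illustration.
row = {
    "Question_id": 1,
    "Essay_id": "E0001",  # unique per essay; this ID format is hypothetical
    "Rubric_a1": 2, "Rubric_a2": 3, "Rubric_a3": 1, "Rubric_a4": 2,
    "Rubric_b1": 2, "Rubric_b2": 2, "Rubric_b3": 1, "Rubric_b4": 3,
}

# Final_Score_A and Final_Score_B are documented as the sums of each
# evaluator's four rubric scores.
final_score_a = sum(row[f"Rubric_a{i}"] for i in range(1, 5))
final_score_b = sum(row[f"Rubric_b{i}"] for i in range(1, 5))
print(final_score_a, final_score_b)  # 8 8
```

The same summation can be used as a sanity check when loading the file, by comparing the recomputed sums against the stored `Final_Score_A`/`Final_Score_B` columns.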
📄 Second File: AR-AES Dataset - Question List & Rubric (Arabic)

Arabic list of essay questions and their corresponding evaluation criteria.

📄 Third File: AR-AES Dataset - Question List & Rubric (English)

English translation of the essay questions and rubric.

📄 Fourth File: AR-AES Dataset - Typical Answers

A reference set of model answers in Arabic and English for each question.

Licensing & Citation

This dataset is licensed under Creative Commons Attribution 4.0 International (CC BY 4.0).
You are free to use, share, and adapt the dataset, even for commercial purposes, as long as you give appropriate credit to the authors.

If you use or reference this dataset in your work, please cite:

```bibtex
@article{ghazawi2024automated,
  title={Automated essay scoring in Arabic: a dataset and analysis of a BERT-based system},
  author={Ghazawi, Rayed and Simpson, Edwin},
  journal={arXiv preprint arXiv:2407.11212},
  year={2024}
}
```