Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed.
Error code: DatasetGenerationError
Exception: ArrowNotImplementedError
Message: Cannot write struct type 'hashes' with no child field to Parquet. Consider adding a dummy child field.
Traceback:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
    writer.write_table(table)
  File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 712, in write_table
    self._build_writer(inferred_schema=pa_table.schema)
  File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 757, in _build_writer
    self.pa_writer = pq.ParquetWriter(
                     ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/pyarrow/parquet/core.py", line 1070, in __init__
    self.writer = _parquet.ParquetWriter(
                  ^^^^^^^^^^^^^^^^^^^^^^^
  File "pyarrow/_parquet.pyx", line 2363, in pyarrow._parquet.ParquetWriter.__cinit__
  File "pyarrow/error.pxi", line 155, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 92, in pyarrow.lib.check_status
pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'hashes' with no child field to Parquet. Consider adding a dummy child field.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1847, in _prepare_split_single
    num_examples, num_bytes = writer.finalize()
                              ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 731, in finalize
    self._build_writer(self.schema)
  File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 757, in _build_writer
    self.pa_writer = pq.ParquetWriter(
                     ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/pyarrow/parquet/core.py", line 1070, in __init__
    self.writer = _parquet.ParquetWriter(
                  ^^^^^^^^^^^^^^^^^^^^^^^
  File "pyarrow/_parquet.pyx", line 2363, in pyarrow._parquet.ParquetWriter.__cinit__
  File "pyarrow/error.pxi", line 155, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 92, in pyarrow.lib.check_status
pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'hashes' with no child field to Parquet. Consider adding a dummy child field.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1450, in compute_config_parquet_and_info_response
    parquet_operations, partial, estimated_dataset_info = stream_convert_to_parquet(
                                                          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 993, in stream_convert_to_parquet
    builder._prepare_split(
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1858, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
| shards (list) | version (int64) |
|---|---|
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 4104701,
"hashes": {}
},
"samples": 500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 4104201,
"hashes": {}
},
"samples": 500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 4104701,
"hashes": {}
},
"samples": 500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 4103201,
"hashes": {}
},
"samples": 500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 4105201,
"hashes": {}
},
"samples": 500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 4108701,
"hashes": {}
},
"samples": 500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 4104201,
"hashes": {}
},
"samples": 500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
[
{
"column_encodings": [
"str",
"bytes"
],
"column_names": [
"set",
"tokens"
],
"column_sizes": [
null
],
"compression": null,
"format": "mds",
"hashes": [],
"raw_data": {
"basename": "shard.00000.mds",
"bytes": 28733701,
"hashes": {}
},
"samples": 3500,
"size_limit": 67108864,
"version": 2,
"zip_data": null
}
] | 2
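The rows above are shard records in the MDS (MosaicML Streaming) format, as stored in the dataset's index file, and each carries the empty `hashes` mappings that break Parquet conversion. A hedged sketch of one possible cleanup, assuming you can rewrite the index before conversion (the record below is copied from one preview row; the cleanup strategy is my own suggestion, not a documented fix):

```python
import json

# One shard record copied from the preview above; both "hashes" fields
# are empty, and the nested {} is what infers to an empty struct.
index = {
    "version": 2,
    "shards": [{
        "column_encodings": ["str", "bytes"],
        "column_names": ["set", "tokens"],
        "column_sizes": [None],
        "compression": None,
        "format": "mds",
        "hashes": [],
        "raw_data": {"basename": "shard.00000.mds", "bytes": 4104701, "hashes": {}},
        "samples": 500,
        "size_limit": 67108864,
        "version": 2,
        "zip_data": None,
    }],
}

# Drop the empty mappings so schema inference never sees a zero-field
# struct (the same effect as the "dummy child field" hint, in reverse).
for shard in index["shards"]:
    if not shard["raw_data"].get("hashes"):
        shard["raw_data"].pop("hashes", None)

cleaned = json.dumps(index["shards"][0]["raw_data"], sort_keys=True)
print(cleaned)
```

After this rewrite the shard records contain only non-empty objects, so a round-trip through Arrow and Parquet no longer hits `ArrowNotImplementedError`.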
README.md exists but content is empty.
Downloads last month: 44