DexCanvas: Dexterous Manipulation Dataset v0.1
⚠️ TEST RELEASE: This is a preview version containing 1% of the full dataset. Contact force data is not included in v0.1.
DexCanvas is a large-scale hybrid dataset for robotic hand-object interaction research, combining real human demonstrations with physics-validated simulation data.
Dataset Statistics (v0.1 Test Release)
- Total Frames: ~30 million multi-view RGB-D frames
- Total Duration: ~70 hours of dexterous hand-object interactions
- Real Demonstrations: ~0.7 hours of human mocap data (1/100 of collected data)
- Expansion Ratio: 100× from real to simulated data
- Manipulation Types: 21 types based on Cutkosky taxonomy
- Objects: 30 objects (geometric primitives + YCB objects)
- Capture Rate: 100 Hz optical motion capture
Manipulation Coverage
The dataset spans four primary grasp categories:
- Power Grasps: Full-hand wrapping grips
- Intermediate Grasps: Mixed precision-power combinations
- Precision Grasps: Fingertip-based manipulation
- In-Hand Manipulation: Object reorientation and repositioning
All 21 manipulation types follow the Cutkosky grasp taxonomy.
Data Modalities
Each frame includes:
- RGB-D Data: Multi-view color and depth images
- Hand Pose: MANO hand parameters with high-precision tracking
- Object State: 6-DoF pose and object wrenches (a pose-conversion sketch follows after this list)
- Annotations: Per-frame labels and metadata
Note: Contact force data is not included in v0.1; it will be added in a future release.
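The object pose above is stored as a length-6 vector per frame. The sketch below converts one such vector into a 4x4 homogeneous transform; it assumes the layout is translation (meters) followed by an axis-angle rotation vector, which is an assumption to verify against the actual export before relying on it.
# Minimal sketch: 6-DoF pose vector -> 4x4 transform (assumed [tx, ty, tz, rx, ry, rz] layout)
import numpy as np
from scipy.spatial.transform import Rotation

def pose6_to_matrix(pose6):
    pose6 = np.asarray(pose6, dtype=np.float64)
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(pose6[3:6]).as_matrix()  # rotation block
    T[:3, 3] = pose6[:3]                                      # translation
    return T

print(pose6_to_matrix([0.1, 0.0, 0.0, 0.0, 0.0, 0.0]))  # 10 cm along x, identity rotation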
Data Pipeline
The dataset is generated through three stages:
- Real Capture: Optical motion capture of human demonstrations at 100 Hz
- Force Reconstruction: RL-based physics simulation to infer contact forces
- Physics Validation: Verification of contact points, forces, and object dynamics
This hybrid approach provides contact information that cannot be observed directly in real-world captures, while maintaining physical accuracy.
Installation
pip install datasets huggingface_hub
For image processing and visualization:
pip install pillow numpy torch
Authenticate with HuggingFace (required for private datasets):
huggingface-cli login
Or set your token as an environment variable:
export HF_TOKEN="your_token_here"
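With the token in place, the files can be fetched locally using huggingface_hub. This is a minimal sketch; the repo id below is a placeholder and must be replaced with the actual dataset id.
# Minimal sketch: download the dataset files (placeholder repo id)
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="your-org/DexCanvas",  # placeholder -- substitute the real dataset repo id
    repo_type="dataset",
)
print(local_dir)  # local path containing the parquet shards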
Quick Start
Data Structure
{
"trajectory_meta_data": {
"generated_data": "int",
"data_fps": "int",
"mocap_raw_data_source": {
"operator": "str",
"object": "str",
"gesture": "str"
},
"total_frames": "int",
"mano_hand_shape": "(10,)"
//...
},
"sequence_info": {
"timestamp": "(T,)",
"hand_joint": {
"position": "(T, 3)",
"rotation": "(T, 3)",
"finger_pose": "(T, 48)"
},
"object_info": {
"pose": "(T, 6)"
},
"mano_model_output": {
"joints": "(T, 63)"
}
}
}
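Below is a minimal loading sketch using the datasets library, assuming the trajectories are stored as parquet shards; "dataset.parquet" is a placeholder path, so point it at a downloaded file. The field names follow the structure shown above.
# Minimal sketch: read one trajectory and inspect the fields shown above
from datasets import load_dataset

ds = load_dataset("parquet", data_files="dataset.parquet", split="train")  # placeholder path
row = ds[0]
meta = row["trajectory_meta_data"]
seq = row["sequence_info"]
print(meta["total_frames"], meta["data_fps"])  # per-trajectory metadata
print(len(seq["timestamp"]))                   # number of frames T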
Visualization
Visualize trajectories using the mocap loader package (hand_trajectory_loader):
# Install dependencies
pip install open3d trimesh scipy
# Visualize trajectory
python -m hand_trajectory_loader.examples.visualize_trajectory \
dataset.parquet 0 \
--mano-model assets/mano/models/MANO_RIGHT.pkl \
--object assets/objects/cube1.stl \
--show-joints
Controls: SPACE pause/resume, M toggle hand mesh, O toggle object, Q quit
Version Information
v0.1 (Test Release) includes:
- 1% of collected real human demonstration data
- MANO hand parameters
- Object pose data
- Manipulation type annotations
Coming in future releases:
- Complete dataset (100× larger than v0.1)
- Contact force data with physics validation
- Additional objects and manipulation types
- Extended annotations and metadata
Contact
- Research collaboration and academic inquiries: [email protected]
- Business inquiries: [email protected]
- Website: https://www.dex-robot.com/en and https://dexcanvas.github.io/
Citation
@article{dexcanvas2025,
title={DexCanvas: A Large-Scale Hybrid Dataset for Dexterous Manipulation},
author={DexRobot Team},
year={2025},
eprint={2510.15786},
archivePrefix={arXiv},
url={https://arxiv.org/abs/2510.15786}
}
License
This dataset is released under the Open Database License (ODbL).
Developed by the DexRobot Team. Last updated: October 2025.