
3EED: Ground Everything Everywhere in 3D – Dataset Card

A cross-platform, multi-modal 3D visual grounding dataset spanning vehicle, drone, and quadruped platforms, with synchronized RGB images, LiDAR point clouds, and language annotations. This page documents how to download and organize the dataset from Hugging Face and how to connect it to the training/evaluation code in the 3EED repository.

1. What's Included

  • Platforms: vehicle, drone, quad (quadruped)
  • Modalities: LiDAR point clouds, RGB images, language referring expressions, metadata
  • Splits: train/val files per platform under splits/
  • Task: 3D visual grounding (language → 3D box)

2. Download

You can download the dataset in any of the following ways:

  • HuggingFace CLI:

    pip install -U "huggingface_hub[cli]"
    huggingface-cli download 3EED/3EED --repo-type dataset --local-dir ./3eed_dataset
    
  • Python:

    from huggingface_hub import snapshot_download
    snapshot_download(repo_id="3EED/3EED", repo_type="dataset", local_dir="./3eed_dataset")
    
  • Git (LFS):

    git lfs install
    git clone https://huggingface.co/datasets/3EED/3EED 3eed_dataset
    
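If you only need part of the dataset (for example, a single platform), snapshot_download also accepts allow_patterns for a selective pull. A minimal sketch, assuming the repository's folder names match the layout shown in Section 3 below:

    from huggingface_hub import snapshot_download

    # Fetch only the drone platform data plus the split files;
    # the folder names follow the directory layout in Section 3.
    snapshot_download(
        repo_id="3EED/3EED",
        repo_type="dataset",
        local_dir="./3eed_dataset",
        allow_patterns=["drone/*", "splits/*"],
    )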

3. Directory Structure

  • Place (or symlink) the downloaded files under data/3eed/ in the code repository. The minimal expected layout is shown below, with paths relative to the repo root; a short verification sketch follows the tree.

    data/3eed/
    ├── drone/                     # Drone platform data
    │   ├── scene-0001/
    │   │   ├── 0000_0/
    │   │   │   ├── image.jpg
    │   │   │   ├── lidar.bin
    │   │   │   └── meta_info.json
    │   │   └── ...
    │   └── ...
    ├── quad/                      # Quadruped platform data
    │   ├── scene-0001/
    │   └── ...
    ├── waymo/                     # Vehicle platform data
    │   ├── scene-0001/
    │   └── ...
    └── splits/                    # Train/val split files
        ├── drone_train.txt
        ├── drone_val.txt
        ├── quad_train.txt
        ├── quad_val.txt
        ├── waymo_train.txt
        └── waymo_val.txt
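After downloading, a quick structural check can catch missing folders before training. The sketch below is illustrative only: the raw-float32 layout assumed for lidar.bin is an assumption, so defer to the data loader in the 3EED repository for the authoritative format.

    import json
    from pathlib import Path

    import numpy as np

    root = Path("data/3eed")

    # Verify the top-level folders from the tree above are present.
    for name in ("drone", "quad", "waymo", "splits"):
        assert (root / name).is_dir(), f"missing {root / name}"

    # Peek at one frame folder (drone/scene-0001/0000_0 in the tree above).
    frame = root / "drone" / "scene-0001" / "0000_0"
    meta = json.loads((frame / "meta_info.json").read_text())
    print("meta_info keys:", sorted(meta))

    # ASSUMPTION: lidar.bin is raw float32; the per-point channel count
    # (x/y/z and any extras) must match the 3EED repo's data loader.
    points = np.fromfile(frame / "lidar.bin", dtype=np.float32)
    print("lidar.bin float32 values:", points.shape[0])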

4. Connect to the Codebase

  • Clone the code repository:

    git clone https://github.com/iris0329/3eed
    cd 3eed
    
  • Link or copy the downloaded dataset to data/3eed/:

    # Example: if the dataset was downloaded to ../3eed_dataset (a sibling
    # of the repo clone). Relative symlink targets resolve against the
    # link's directory, so use an absolute path to avoid a dangling link.
    mkdir -p data
    ln -s "$(realpath ../3eed_dataset)" data/3eed
    

Now you can follow the Installation, Custom CUDA Operators, Training, and Evaluation sections in the GitHub README:

  • Train on all platforms:

    bash scripts/train_3eed.sh
    
  • Train on a single platform:

    bash scripts/train_waymo.sh   # vehicle
    bash scripts/train_drone.sh   # drone
    bash scripts/train_quad.sh    # quadruped
    
  • Evaluate:

    bash scripts/val_3eed.sh
    bash scripts/val_waymo.sh
    bash scripts/val_drone.sh
    bash scripts/val_quad.sh
    

Remember to set the correct --checkpoint_path inside the evaluation scripts.

5. Data Splits

We provide official splits under data/3eed/splits/:

  • *_train.txt: training scene/frame indices for each platform
  • *_val.txt: validation scene/frame indices for each platform

Please keep these files unchanged for fair comparison with the baselines and reported results.
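The split files are plain text. A minimal reader sketch follows, assuming one scene/frame identifier per line (the exact identifier format is defined by the 3EED codebase):

    from pathlib import Path

    def read_split(platform: str, split: str) -> list[str]:
        """Read one split file, assuming one sample id per line."""
        path = Path("data/3eed/splits") / f"{platform}_{split}.txt"
        return [ln.strip() for ln in path.read_text().splitlines() if ln.strip()]

    train_ids = read_split("waymo", "train")
    val_ids = read_split("waymo", "val")
    print(f"waymo: {len(train_ids)} train / {len(val_ids)} val samples")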

6. Usage Tips

  • Storage: the combined LiDAR + RGB data is large; ensure sufficient disk space, or sync only the platforms you need (see the selective-download sketch in Section 2).
  • I/O throughput: for faster training/evaluation, place frequently used scenes on fast local SSDs or use caching.
  • Reproducibility: use the exact environment files and scripts from the code repo; training on all platforms vs. a single platform is controlled by the provided scripts.

7. License

  • Dataset license: Apache-2.0 (see the header of this page).
  • The code repository uses Apache-2.0; refer to the LICENSE in the GitHub repo.

If you plan to use, redistribute, or modify the dataset, please review the dataset license and any upstream source licenses (e.g., Waymo Open Dataset, M3ED).

8. Citation

  • If you find 3EED helpful, please cite:
    @inproceedings{li2025_3eed,
      title     = {3EED: Ground Everything Everywhere in 3D},
      author    = {Rong Li and Yuhao Dong and Tianshuai Hu and Ao Liang and 
                   Youquan Liu and Dongyue Lu and Liang Pan and Lingdong Kong and 
                   Junwei Liang and Ziwei Liu},
      booktitle = {Advances in Neural Information Processing Systems (NeurIPS)
                   Datasets and Benchmarks Track},
      year      = {2025}
    }
    

9. Acknowledgements

We acknowledge the following upstream sources that make this dataset possible:

  • Waymo Open Dataset (vehicle platform)
  • M3ED (drone and quadruped platforms)

For baseline implementations and evaluation code, please refer to the GitHub repository.
