---
license: artistic-2.0
tags:
- visual-grounding
- lidar
- 3d
---
# 3EED: Ground Everything Everywhere in 3D — Dataset Card
A cross-platform, multi-modal 3D visual grounding dataset spanning **vehicle**, **drone**, and **quadruped** platforms, with synchronized **RGB**, **LiDAR**, and **language** annotations. This page documents how to obtain and organize the dataset from HuggingFace and how to connect it with the training/evaluation code in the 3EED repository.
- Project Page: https://3eed.github.io
- Code (Baselines & Evaluation): https://github.com/iris0329/3eed
- Paper: https://arxiv.org/ (coming soon)
## 1. What’s Included
- Platforms: `vehicle`, `drone`, `quad` (quadruped)
- Modalities: LiDAR point clouds, RGB images, language referring expressions, metadata
- Splits: train/val files per platform under `splits/`
- Task: 3D visual grounding (language → 3D box)
## 2. Download
You can download via:
- HuggingFace CLI:
```bash
pip install -U "huggingface_hub[cli]"
huggingface-cli download 3EED/3EED --repo-type dataset --local-dir ./3eed_dataset
```
- Python:
```python
from huggingface_hub import snapshot_download
snapshot_download(repo_id="3EED/3EED", repo_type="dataset", local_dir="./3eed_dataset")
```
- Git (LFS):
```bash
git lfs install
git clone https://huggingface.co/datasets/3EED/3EED 3eed_dataset
```
## 3. Directory Structure
- Place (or symlink) the downloaded files under `data/3eed/` in the code repository, and verify they match the minimal expected layout below (paths shown relative to the repo root):
```
data/3eed/
├── drone/ # Drone platform data
│ ├── scene-0001/
│ │ ├── 0000_0/
│ │ │ ├── image.jpg
│ │ │ ├── lidar.bin
│ │ │ └── meta_info.json
│ │ └── ...
│ └── ...
├── quad/ # Quadruped platform data
│ ├── scene-0001/
│ └── ...
├── waymo/ # Vehicle platform data
│ ├── scene-0001/
│ └── ...
└── splits/ # Train/val split files
├── drone_train.txt
├── drone_val.txt
├── quad_train.txt
├── quad_val.txt
├── waymo_train.txt
└── waymo_val.txt
```
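As a quick sanity check, the snippet below loads a single frame from this layout. It is a minimal sketch: the LiDAR point layout (a flat `float32` buffer with 4 channels per point) and the `meta_info.json` schema are assumptions based on common conventions, not guaranteed by this card; the dataloader in the 3eed repository is authoritative.
```python
import json
import numpy as np
from PIL import Image

frame_dir = "data/3eed/drone/scene-0001/0000_0"  # one frame, as in the layout above

# RGB image
image = Image.open(f"{frame_dir}/image.jpg")

# LiDAR point cloud: assumed to be a flat float32 buffer; the number of
# channels per point (here 4, e.g. x, y, z, intensity) is an assumption --
# verify against the official dataloader in the 3eed repo.
points = np.fromfile(f"{frame_dir}/lidar.bin", dtype=np.float32).reshape(-1, 4)

# Metadata: referring expression, 3D box, calibration, etc. (schema assumed)
with open(f"{frame_dir}/meta_info.json") as f:
    meta = json.load(f)

print(image.size, points.shape, list(meta.keys()))
```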
## 4. Connect to the Codebase
- Clone the code repository:
```bash
git clone https://github.com/iris0329/3eed
cd 3eed
```
- Link or copy the downloaded dataset to `data/3eed/`:
```bash
# Example: if your dataset is in ../3eed_dataset
ln -s ../3eed_dataset data/3eed
```
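To confirm the link resolved correctly, a minimal check can verify that every platform directory and the split files are in place. The expected names come from the layout in Section 3; the per-line sample count assumes one sample identifier per line in each split file, which is an assumption:
```python
from pathlib import Path

root = Path("data/3eed")

# Platform directories and split folder from the layout in Section 3
missing = [d for d in ("drone", "quad", "waymo", "splits") if not (root / d).exists()]
assert not missing, f"Missing entries under {root}: {missing}"

# Rough sample counts, assuming one sample identifier per line
for split_file in sorted((root / "splits").glob("*.txt")):
    n_lines = sum(1 for line in split_file.open() if line.strip())
    print(f"{split_file.name}: {n_lines} samples")
```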
Now you can follow the **Installation**, **Custom CUDA Operators**, **Training**, and **Evaluation** sections in the GitHub README:
* Train on all platforms:
```bash
bash scripts/train_3eed.sh
```
* Train on a single platform:
```bash
bash scripts/train_waymo.sh # vehicle
bash scripts/train_drone.sh # drone
bash scripts/train_quad.sh # quadruped
```
* Evaluate:
```bash
bash scripts/val_3eed.sh
bash scripts/val_waymo.sh
bash scripts/val_drone.sh
bash scripts/val_quad.sh
```
Remember to set the correct `--checkpoint_path` inside the evaluation scripts.
## 5. Data Splits
We provide official splits under `data/3eed/splits/`:
* `*_train.txt`: training scene/frame indices for each platform
* `*_val.txt`: validation scene/frame indices for each platform
Please keep these files unchanged to ensure fair comparison with the baselines and the reported results.
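The exact line format of the split files is not specified on this card; assuming each line names one scene/frame sample relative to its platform directory (e.g. `scene-0001/0000_0`), entries could be resolved to frame directories as sketched below. This is an illustration, not the official dataloader:
```python
from pathlib import Path

def load_split(platform: str, split: str, root: str = "data/3eed"):
    """Read one split file and resolve each entry to a frame directory.

    Assumes one sample identifier per line; the actual format is defined
    by the dataloader in the 3eed code repository.
    """
    split_file = Path(root) / "splits" / f"{platform}_{split}.txt"
    entries = [ln.strip() for ln in split_file.read_text().splitlines() if ln.strip()]
    return [Path(root) / platform / e for e in entries]

val_frames = load_split("drone", "val")
print(len(val_frames), "validation samples")
if val_frames:
    print("first sample:", val_frames[0])
```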
## 6. Usage Tips
* Storage: LiDAR + RGB data can be large; ensure sufficient disk space, and use Git LFS or pattern-based filtering for a partial sync if needed (see the sketch after this list).
* IO Throughput: For faster training/evaluation, place frequently used scenes on fast local SSDs or use caching.
* Reproducibility: Use the exact environment files and scripts from the code repo; whether a run trains on all platforms jointly or on a single platform is controlled by the provided scripts.
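If disk space is tight, `snapshot_download` can fetch a subset of the repository via its `allow_patterns` argument. The sketch below pulls only the drone data and the split files; the glob patterns assume the repo's internal paths match the layout in Section 3:
```python
from huggingface_hub import snapshot_download

# Download only the drone platform plus the split files; adjust the
# patterns if the repository's internal paths differ from the layout above.
snapshot_download(
    repo_id="3EED/3EED",
    repo_type="dataset",
    local_dir="./3eed_dataset",
    allow_patterns=["drone/**", "splits/*"],
)
```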
## 7. License
* Dataset license: **Artistic-2.0** (see the metadata header of this page).
* The **code repository** uses **Apache-2.0**; refer to the LICENSE in the GitHub repo.
If you plan to use, redistribute, or modify the dataset, please review the dataset license and any upstream source licenses (e.g., Waymo Open Dataset, M3ED).
## 8. Citation
- If you find 3EED helpful, please cite:
```bibtex
@inproceedings{li2025_3eed,
title = {3EED: Ground Everything Everywhere in 3D},
author = {Rong Li and Yuhao Dong and Tianshuai Hu and Ao Liang and
Youquan Liu and Dongyue Lu and Liang Pan and Lingdong Kong and
Junwei Liang and Ziwei Liu},
booktitle = {Advances in Neural Information Processing Systems (NeurIPS)
Datasets and Benchmarks Track},
year = {2025}
}
```
## 9. Acknowledgements
We acknowledge the following upstream sources, which make this dataset possible:
* Waymo Open Dataset (vehicle platform)
* M3ED (drone and quadruped platforms)
For baseline implementations and evaluation code, please refer to the GitHub repository. |