---
license: artistic-2.0
tags:
- visual-grounding
- lidar
- 3d
---

# 3EED: Ground Everything Everywhere in 3D — Dataset Card

A cross-platform, multi-modal 3D visual grounding dataset spanning **vehicle**, **drone**, and **quadruped** platforms, with synchronized **RGB**, **LiDAR**, and **language** annotations. This page documents how to obtain and organize the dataset from HuggingFace and how to connect it with the training/evaluation code in the 3EED repository.

- Project Page: https://3eed.github.io
- Code (Baselines & Evaluation): https://github.com/iris0329/3eed
- Paper: https://arxiv.org/ (coming soon)

## 1. What’s Included

- Platforms: `vehicle`, `drone`, `quad` (quadruped)
- Modalities: LiDAR point clouds, RGB images, language referring expressions, metadata
- Splits: train/val files per platform under `splits/`
- Task: 3D visual grounding (language → 3D box); a conceptual sample sketch follows this list
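
To make the task concrete, the sketch below shows what one grounding sample conceptually contains. The field names and the 7-DoF box parameterization are illustrative assumptions, not the dataset's actual schema; consult `meta_info.json` and the dataloaders in the code repository for the real format.

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class GroundingSample:
    """Illustrative (hypothetical) structure of one 3EED sample."""
    platform: str       # "vehicle" | "drone" | "quad"
    points: np.ndarray  # LiDAR point cloud, (N, C): xyz plus extra channels
    image: np.ndarray   # RGB frame, (H, W, 3)
    text: str           # referring expression, e.g. "the cyclist near the crosswalk"
    box: np.ndarray     # assumed 7-DoF target box: (x, y, z, l, w, h, yaw)
```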

## 2. Download

You can download via:

- HuggingFace CLI:

```bash
pip install -U "huggingface_hub[cli]"
huggingface-cli download 3EED/3EED --repo-type dataset --local-dir ./3eed_dataset
```

- Python (see also the subset-download sketch after this list):

```python
from huggingface_hub import snapshot_download
snapshot_download(repo_id="3EED/3EED", repo_type="dataset", local_dir="./3eed_dataset")
```

- Git (LFS):

```bash
git lfs install
git clone https://huggingface.co/datasets/3EED/3EED 3eed_dataset
```
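
If you only need a subset (for example a single platform), the Python route above also supports glob filtering via `snapshot_download`'s `allow_patterns` argument. A minimal sketch, assuming the per-platform folders sit at the top level of the dataset repo as in the layout shown below:

```python
from huggingface_hub import snapshot_download

# Fetch only the drone scenes and the drone split files; the patterns
# assume the top-level layout documented in "Directory Structure".
snapshot_download(
    repo_id="3EED/3EED",
    repo_type="dataset",
    local_dir="./3eed_dataset",
    allow_patterns=["drone/*", "splits/drone_*"],
)
```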

## 3. Directory Structure

- Place or verify the files under `data/3eed/` in your project. A minimal expected layout (paths shown relative to the repo root):

```
data/3eed/
├── drone/                     # Drone platform data
│   ├── scene-0001/
│   │   ├── 0000_0/
│   │   │   ├── image.jpg
│   │   │   ├── lidar.bin
│   │   │   └── meta_info.json
│   │   └── ...
│   └── ...
├── quad/                      # Quadruped platform data
│   ├── scene-0001/
│   └── ...
├── waymo/                     # Vehicle platform data
│   ├── scene-0001/
│   └── ...
└── splits/                    # Train/val split files
    ├── drone_train.txt
    ├── drone_val.txt
    ├── quad_train.txt
    ├── quad_val.txt
    ├── waymo_train.txt
    └── waymo_val.txt
```
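
A minimal per-frame loading sketch. It assumes `lidar.bin` stores flat float32 records, a common convention for `.bin` LiDAR dumps; the actual channel count should be checked against `meta_info.json` or the dataloaders in the code repository:

```python
import json
from pathlib import Path

import numpy as np
from PIL import Image

frame = Path("data/3eed/drone/scene-0001/0000_0")

image = Image.open(frame / "image.jpg")                    # RGB frame
meta = json.loads((frame / "meta_info.json").read_text())  # annotations/metadata

# Assumption: float32 records with C channels per point; C = 4
# (x, y, z, intensity) is a common choice, but verify before relying on it.
raw = np.fromfile(frame / "lidar.bin", dtype=np.float32)
points = raw.reshape(-1, 4)

print(image.size, points.shape, list(meta)[:5])
```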

## 4. Connect to the Codebase

- Clone the code repository:

```bash
git clone https://github.com/iris0329/3eed
cd 3eed
```

- Link or copy the downloaded dataset to `data/3eed/`:

```bash
# Example: if your dataset is in ../3eed_dataset (relative to the repo root).
# Use an absolute target: a relative target would be resolved against data/,
# not against the directory you run this command from.
mkdir -p data
ln -s "$(realpath ../3eed_dataset)" data/3eed
```
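
After linking, a quick sanity check is worthwhile. A minimal sketch that only verifies the top-level folders from the layout above are present:

```python
from pathlib import Path

root = Path("data/3eed")
missing = [d for d in ("drone", "quad", "waymo", "splits") if not (root / d).is_dir()]
print("layout OK" if not missing else f"missing folders: {missing}")
```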

Now you can follow the **Installation**, **Custom CUDA Operators**, **Training**, and **Evaluation** sections in the GitHub README:

* Train on all platforms:

```bash
bash scripts/train_3eed.sh
```

* Train on a single platform:

```bash
bash scripts/train_waymo.sh   # vehicle
bash scripts/train_drone.sh   # drone
bash scripts/train_quad.sh    # quadruped
```

* Evaluate:

```bash
bash scripts/val_3eed.sh
bash scripts/val_waymo.sh
bash scripts/val_drone.sh
bash scripts/val_quad.sh
```

Remember to set the correct `--checkpoint_path` inside the evaluation scripts.

## 5. Data Splits

We provide official splits under `data/3eed/splits/`:

* `*_train.txt`: training scene/frame indices for each platform
* `*_val.txt`: validation scene/frame indices for each platform

Please keep these files unchanged for fair comparison with the baselines and reported results.
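
For programmatic access, here is a minimal reading sketch. The per-line format of the split files is not documented on this card, so the one-identifier-per-line assumption should be verified against the actual files:

```python
from pathlib import Path

# Assumption: one scene/frame identifier per non-empty line.
split_file = Path("data/3eed/splits/drone_train.txt")
frame_ids = [ln.strip() for ln in split_file.read_text().splitlines() if ln.strip()]
print(f"{len(frame_ids)} training entries, e.g. {frame_ids[:3]}")
```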

## 6. Usage Tips

* Storage: LiDAR + RGB data can be large; ensure sufficient disk space and use Git LFS for partial sync if needed.
* IO Throughput: For faster training/evaluation, place frequently used scenes on fast local SSDs or use caching (a staging sketch follows this list).
* Reproducibility: Use the exact environment files and scripts from the code repo; platform unions vs. single-platform runs are controlled by the provided scripts.
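
A minimal staging sketch for the caching tip. The `/mnt/ssd` mount point is a hypothetical example, not a path the dataset or code repo defines:

```python
import shutil
from pathlib import Path

# Copy one frequently used scene to fast local storage, then point the
# dataloader (or a symlink) at the staged copy.
src = Path("data/3eed/drone/scene-0001")
dst = Path("/mnt/ssd/3eed_cache/drone/scene-0001")  # hypothetical SSD path

if not dst.exists():
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copytree(src, dst)
```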

## 7. License

* Dataset license: **Artistic-2.0** (see the YAML header of this card).
* The **code repository** uses **Apache-2.0**; refer to the LICENSE file in the GitHub repo.

If you plan to use, redistribute, or modify the dataset, please review the dataset license and any upstream source licenses (e.g., Waymo Open Dataset, M3ED).
138
+
139
+ ## 8. Citation
140
+
141
+ - If you find 3EED helpful, please cite:
142
+ ```bibtex
143
+ @inproceedings{li2025_3eed,
144
+ title = {3EED: Ground Everything Everywhere in 3D},
145
+ author = {Rong Li and Yuhao Dong and Tianshuai Hu and Ao Liang and
146
+ Youquan Liu and Dongyue Lu and Liang Pan and Lingdong Kong and
147
+ Junwei Liang and Ziwei Liu},
148
+ booktitle = {Advances in Neural Information Processing Systems (NeurIPS)
149
+ Datasets and Benchmarks Track},
150
+ year = {2025}
151
+ }
152
+ ```

## 9. Acknowledgements

We acknowledge the following upstream sources, which make this dataset possible:

* Waymo Open Dataset (vehicle platform)
* M3ED (drone and quadruped platforms)

For baseline implementations and evaluation code, please refer to the GitHub repository.