handsomeYun and nielsr (HF Staff) committed on
Commit 9d5c148 · verified · 1 Parent(s): de3fec5

Add M2I dataset card with metadata, paper, code, and sample usage (#1)


- Add M2I dataset card with metadata, paper, code, and sample usage (4368ded1bd0d7aebd5ac73e7977935d1f473bb89)


Co-authored-by: Niels Rogge <[email protected]>

Files changed (1): README.md (+55 -0)
README.md ADDED
---
task_categories:
- object-detection
license: unknown
tags:
- 3d-object-detection
- computer-vision
- autonomous-driving
- multi-view
- synthetic-data
- bird-s-eye-view
language:
- en
---
# M2I Dataset

The M2I dataset is a synthetic dataset introduced in the paper [MIC-BEV: Multi-Infrastructure Camera Bird's-Eye-View Transformer with Relation-Aware Fusion for 3D Object Detection](https://huggingface.co/papers/2510.24688).

M2I supports training and evaluation of models for infrastructure-based multi-camera 3D object detection. It covers diverse camera configurations, road layouts, and environmental conditions, making it well suited for developing robust perception systems for intelligent transportation, and it is used in conjunction with frameworks such as MIC-BEV.
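The core geometric operation behind multi-camera 3D detection is projecting points from a shared world frame into each camera's image plane. The sketch below illustrates standard pinhole projection with NumPy; the intrinsics and extrinsics are arbitrary example values, not parameters from M2I or MIC-BEV.

```python
import numpy as np

def project_to_image(point_world, extrinsic, intrinsic):
    """Project a 3D world point into pixel coordinates via a pinhole camera.

    extrinsic: 4x4 world-to-camera transform; intrinsic: 3x3 camera matrix.
    """
    p = extrinsic @ np.append(point_world, 1.0)  # world -> camera frame
    uvw = intrinsic @ p[:3]                      # camera frame -> image plane
    return uvw[:2] / uvw[2]                      # perspective divide

# Hypothetical infrastructure camera (example values only).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
T = np.eye(4)
T[2, 3] = 10.0  # world origin sits 10 m in front of the camera along its optical axis

uv = project_to_image(np.array([0.0, 0.0, 0.0]), T, K)
print(uv)  # the world origin projects to the principal point (640, 360)
```

In a multi-infrastructure setup, each camera contributes its own extrinsic/intrinsic pair, and a BEV model fuses the per-camera features back into a common ground-plane frame.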
21
+
22
+ ## Paper
23
+
24
+ [MIC-BEV: Multi-Infrastructure Camera Bird's-Eye-View Transformer with Relation-Aware Fusion for 3D Object Detection](https://huggingface.co/papers/2510.24688)
25
+
26
+ ## Code
27
+
28
+ The official code repository for MIC-BEV, which utilizes this dataset, can be found here: [https://github.com/HandsomeYun/MIC-BEV](https://github.com/HandsomeYun/MIC-BEV)
29
+
30
+ ## Sample Usage
31
+
32
+ The M2I dataset is intended to be used with the MIC-BEV codebase for training and evaluation of 3D object detection models. The following snippets demonstrate how to prepare the dataset and perform quick training/evaluation, as outlined in the [MIC-BEV GitHub repository](https://github.com/HandsomeYun/MIC-BEV).
33
+
### Prepare Dataset

To prepare M2I-style data (e.g., V2XSet), use the preprocessing script provided in the MIC-BEV repository:

```bash
python MIC-BEV_Official/tools/data_converter/mic-bev/create_v2xset_multiple_map.py
```

This script processes raw V2XSet-format data and generates the multi-map annotations needed to train MIC-BEV on M2I.
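If the raw files are hosted on the Hugging Face Hub, one way to fetch them locally before running the converter is `huggingface_hub.snapshot_download`. This is a sketch; the `repo_id` below is an assumption, so confirm the exact id on this dataset card before using it.

```python
def download_m2i(local_dir="./data/M2I", repo_id="handsomeYun/M2I"):
    """Download the raw dataset files from the Hugging Face Hub.

    repo_id is an assumption -- check the dataset card for the actual id.
    """
    # Imported lazily so the sketch stays importable without huggingface_hub installed.
    from huggingface_hub import snapshot_download
    return snapshot_download(repo_id=repo_id, repo_type="dataset", local_dir=local_dir)
```

Call `download_m2i()` once, then point the preprocessing script at the downloaded directory.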
### Quick Start (Training and Evaluation)

After setting up the environment (see the repository's `Installation` section for full details) and preparing the dataset, use the MIC-BEV configuration files for training and evaluation:

1. **Training**: use the MIC-BEV configuration file:
   ```bash
   python tools/train.py projects/configs/micbev/mic-bev-seg-gnn.py
   ```

2. **Evaluation**: evaluate your trained model:
   ```bash
   python tools/test.py projects/configs/micbev/mic-bev-seg-gnn.py /path/to/checkpoint.pth --eval bbox
   ```
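BEV detectors like MIC-BEV reason on a discretized bird's-eye-view grid, mapping metric ground-plane coordinates to grid cells. As a rough, self-contained illustration of that rasterization step (the grid extent and resolution here are hypothetical, not values from the MIC-BEV configs):

```python
import numpy as np

def world_to_bev_index(xy, bev_range=(-51.2, 51.2), resolution=0.8):
    """Map metric (x, y) ground-plane coordinates to integer BEV grid indices.

    Hypothetical grid: 128x128 cells covering +/-51.2 m at 0.8 m per cell,
    a common setup in BEV detectors (not taken from the MIC-BEV configs).
    """
    lo, hi = bev_range
    idx = np.floor((np.asarray(xy) - lo) / resolution).astype(int)
    n_cells = int(round((hi - lo) / resolution))
    return np.clip(idx, 0, n_cells - 1)  # clamp out-of-range points to the border

print(world_to_bev_index((0.0, 0.0)))     # grid center -> [64 64]
print(world_to_bev_index((-51.2, 51.1)))  # near the corners -> [0 127]
```

Detection heads then predict box centers, sizes, and headings per BEV cell, which is why grid resolution trades off localization precision against memory.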