|
|
--- |
|
|
pretty_name: InternData-A1 |
|
|
size_categories: |
|
|
- n>1T |
|
|
task_categories: |
|
|
- other |
|
|
- robotics |
|
|
language: |
|
|
- en |
|
|
tags: |
|
|
- Embodied-AI |
|
|
- Robotic manipulation |
|
|
extra_gated_prompt: >- |
|
|
### InternData-A1 COMMUNITY LICENSE AGREEMENT |
|
|
|
|
|
InternData-A1 Release Date: July 26, 2025. All the data and code within this |
|
|
repo are under [CC BY-NC-SA |
|
|
4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/). |
|
|
extra_gated_fields: |
|
|
First Name: text |
|
|
Last Name: text |
|
|
Email: text |
|
|
Country: country |
|
|
Affiliation: text |
|
|
Phone: text |
|
|
Job title: |
|
|
type: select |
|
|
options: |
|
|
- Student |
|
|
- Research Graduate |
|
|
- AI researcher |
|
|
- AI developer/engineer |
|
|
- Reporter |
|
|
- Other |
|
|
Research interest: text |
|
|
geo: ip_location |
|
|
  By clicking Submit below I accept the terms of the license and acknowledge that the information I provide will be collected, stored, processed and shared in accordance with the InternData Privacy Policy: checkbox
|
|
extra_gated_description: >- |
|
|
The information you provide will be collected, stored, processed and shared in |
|
|
accordance with the InternData Privacy Policy. |
|
|
extra_gated_button_content: Submit |
|
|
--- |
|
|
|
|
|
# InternData-A1 |
|
|
|
|
|
<div style="display: flex; justify-content: center; align-items: center; margin: 20px 0;"> |
|
|
<img src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/teaser.png" alt="Teaser Image" style="max-width: 100%; border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
</div> |
|
|
|
|
|
<div align="center" style="margin: 24px 0;"> |
|
|
<a href="https://arxiv.org/abs/2511.16651" style="text-decoration:none;"> |
|
|
<img src="https://img.shields.io/badge/arXiv-2511.16651-red.svg?logo=arxiv&logoColor=white" alt="arXiv Paper: 2511.16651" style="vertical-align: middle; margin-right: 24px; height: 25px;"> |
|
|
</a> |
|
|
<a href="https://internrobotics.github.io/interndata-a1.github.io/" style="text-decoration:none;"> |
|
|
<img src="https://img.shields.io/badge/Project%20Page-interndata--a1.github.io-1976d2?logo=githubpages&logoColor=white" alt="Project Homepage" style="vertical-align: middle; margin-right: 24px; height: 25px;"> |
|
|
</a> |
|
|
<a href="https://github.com/InternRobotics" style="text-decoration:none;"> |
|
|
<img src="https://img.shields.io/badge/Code-GitHub-181717?logo=github&logoColor=white" alt="Code Repository" style="vertical-align: middle; height: 25px;"> |
|
|
</a> |
|
|
</div> |
|
|
|
|
|
|
|
|
<strong>InternData-A1</strong> is a hybrid synthetic-real manipulation dataset with over 630k trajectories (7,433 hours) spanning 4 embodiments, 18 skills, 70 tasks, and 227 scenes, covering rigid, articulated, deformable, and fluid-object manipulation.
|
|
|
|
|
<div style="display: flex; justify-content: center; align-items: center; margin: 20px 0;"> |
|
|
<img src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/stats.png" alt="Stats Image" style="max-width: 100%; border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
</div> |
|
|
|
|
|
<div style="display: flex; flex-direction: column; align-items: center; gap: 10px;"> |
|
|
<!-- First Row --> |
|
|
<div style="display: flex; justify-content: center; align-items: center; gap: 10px;"> |
|
|
<video controls autoplay loop muted width="50%" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
<source src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/dynamic_pick.mp4" type="video/mp4"> |
|
|
Your browser does not support the video tag. |
|
|
</video> |
|
|
<video controls autoplay loop muted width="50%" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
<source src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/stack_sandwich.mp4" type="video/mp4"> |
|
|
Your browser does not support the video tag. |
|
|
</video> |
|
|
</div> |
|
|
<!-- Second Row --> |
|
|
<div style="display: flex; justify-content: center; align-items: center; gap: 10px;"> |
|
|
<video controls autoplay loop muted width="50%" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
<source src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/fold_shirts.mp4" type="video/mp4"> |
|
|
Your browser does not support the video tag. |
|
|
</video> |
|
|
<video controls autoplay loop muted width="50%" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
<source src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/pour_baijiu.mp4" type="video/mp4"> |
|
|
Your browser does not support the video tag. |
|
|
</video> |
|
|
</div> |
|
|
<!-- Third Row --> |
|
|
<div style="display: flex; justify-content: center; align-items: center; gap: 10px;"> |
|
|
<video controls autoplay loop muted width="50%" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
<source src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/flip_package.mp4" type="video/mp4"> |
|
|
Your browser does not support the video tag. |
|
|
</video> |
|
|
<video controls autoplay loop muted width="50%" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);"> |
|
|
<source src="https://huggingface.co/spaces/xushicd/InternData_Media/resolve/main/shelf_pick.mp4" type="video/mp4"> |
|
|
Your browser does not support the video tag. |
|
|
</video> |
|
|
</div> |
|
|
</div> |
|
|
|
|
|
# 🔑 Key Features |
|
|
- **Heterogeneous multi-robot platforms:** ARX Lift-2, AgileX Split Aloha, A2D, Franka |
|
|
- **Hybrid synthetic-real** manipulation demonstrations with **task-level digital twins**, containing four task categories: |
|
|
- Articulation tasks |
|
|
- Basic tasks |
|
|
- Long-horizon tasks |
|
|
- Pick and place tasks |
|
|
- **Diverse scenarios include:** |
|
|
  - Moving-object manipulation in conveyor-belt scenarios
|
|
- Rigid, articulated, deformable, and fluid-object manipulation |
|
|
- Multi-robot / multi-arm collaboration |
|
|
- Human-robot interaction |
|
|
|
|
|
|
|
|
# 📋 Table of Contents |
|
|
- [Get started 🔥](#get-started-) |
|
|
- [Download the Dataset](#download-the-dataset) |
|
|
- [Dataset Structure](#dataset-structure) |
|
|
- [📅 TODO List](#-todo-list)
|
|
- [License and Citation](#license-and-citation) |
|
|
|
|
|
# Get started 🔥 |
|
|
## Download the Dataset |
|
|
To download the full dataset, run the following commands. If you encounter any issues, please refer to the official Hugging Face documentation.
|
|
```bash
|
|
# Make sure you have git-lfs installed (https://git-lfs.com) |
|
|
git lfs install |
|
|
|
|
|
# When prompted for a password, use an access token with at least read access to this gated dataset.
|
|
# Generate one from your settings: https://huggingface.co/settings/tokens |
|
|
git clone https://huggingface.co/datasets/InternRobotics/InternData-A1 |
|
|
|
|
|
# If you want to clone without large files - just their pointers |
|
|
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/datasets/InternRobotics/InternData-A1 |
|
|
``` |
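
Alternatively, the dataset can be fetched with the `huggingface_hub` Python client. The snippet below is a minimal sketch, not part of the official instructions; it assumes you have run `huggingface-cli login` (or pass a token) since the dataset is gated, and the `allow_patterns` path is only a hypothetical example of narrowing the download to one task.

```python
# Minimal sketch: download via huggingface_hub instead of git (assumes `pip install huggingface_hub`
# and that you are logged in with an access token, since this dataset is gated).
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="InternRobotics/InternData-A1",
    repo_type="dataset",
    # Hypothetical example of restricting the download to a single task folder:
    # allow_patterns="data/sim/long_horizon_tasks/lift2/sort_the_rubbish/**",
)
print("Dataset downloaded to:", local_dir)
```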
|
|
|
|
|
## Dataset Structure |
|
|
### Folder hierarchy |
|
|
``` |
|
|
data |
|
|
├── sim |
|
|
│ ├── articulation_tasks |
|
|
│ │ └── ... |
|
|
│ ├── basic_tasks |
|
|
│ │ └── ... |
|
|
│ ├── long_horizon_tasks # category |
|
|
│ │ ├── franka # robot |
|
|
│ │ │ └── ... |
|
|
│ │ ├── lift2 |
|
|
│ │ │ ├── sort_the_rubbish # task |
|
|
│ │ │ │ ├── data |
|
|
│ │ │ │ │ ├── chunk-000 |
|
|
│ │ │ │ │ │ ├── episode_000000.parquet |
|
|
│ │ │ │ │ │ ├── episode_000001.parquet |
|
|
│ │ │ │ │ │ ├── episode_000002.parquet |
|
|
│ │ │ │ │ │ ├── ... |
|
|
│ │ │ │ │ ├── chunk-001 |
|
|
│ │ │ │ │ │ ├── ... |
|
|
│ │ │ │ │ ├── ... |
|
|
│ │ │ │ ├── meta |
|
|
│ │ │ │ │ ├── episodes.jsonl |
|
|
│ │ │ │ │ ├── episodes_stats.jsonl |
|
|
│ │ │ │ │ ├── info.json |
|
|
│ │ │ │ │ ├── modality.json |
|
|
│ │ │ │ │ ├── stats.json |
|
|
│ │ │ │ │ ├── tasks.jsonl |
|
|
│ │ │ │ ├── videos |
|
|
│ │ │ │ │ ├── chunk-000 |
|
|
│ │ │ │ │ │ ├── images.rgb.head |
|
|
│ │ │ │ │ │ │ ├── episode_000000.mp4 |
|
|
│ │ │ │ │ │ │ ├── episode_000001.mp4 |
|
|
│ │ │ │ │ │ │ ├── ... |
|
|
│ │ │ │ │ │ ├── ... |
|
|
│ │ │ │ │ ├── chunk-001 |
|
|
│ │ │ │ │ │ ├── ... |
|
|
│ │ │ │ │ ├── ... |
|
|
│ │ │ ├──... |
|
|
│ │ ├── split_aloha |
|
|
│ │ │ └── ... |
|
|
│ │ ├── ... |
|
|
│ ├── pick_and_place_tasks |
|
|
│ │ └── ... |
|
|
│ ├── ... |
|
|
├── real |
|
|
│ ├── ... |
|
|
``` |
|
|
|
|
|
Each task subdataset (such as `sort_the_rubbish`) was created with [LeRobot](https://github.com/huggingface/lerobot) (dataset format v2.1). For compatibility with the GROOT training framework, two additional files are included: `stats.json`, which provides statistics (mean, std, min, max, q01, q99) for each feature across the dataset, and `modality.json`, which defines model-related custom modalities.
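
As an illustration, the meta files of one task can be inspected with a few lines of Python. This is a sketch under the assumptions that the task path below matches your local copy and that `stats.json` is keyed by feature name as described above; verify both against the actual files.

```python
import json
from pathlib import Path

# Hypothetical local path to one task, following the folder hierarchy above.
task_root = Path("data/sim/long_horizon_tasks/lift2/sort_the_rubbish")

# info.json describes the layout of this task's data (episodes, fps, features, ...).
info = json.loads((task_root / "meta" / "info.json").read_text())
print(info["robot_type"], info["total_episodes"], info["fps"])

# stats.json provides per-feature statistics (mean, std, min, max, q01, q99);
# the exact nesting is assumed here, so inspect the file before relying on it.
stats = json.loads((task_root / "meta" / "stats.json").read_text())
print("features with statistics:", list(stats))

# episodes.jsonl stores one JSON record per episode.
with open(task_root / "meta" / "episodes.jsonl") as f:
    print("first episode record:", json.loads(f.readline()))
```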
|
|
|
|
|
### `meta/info.json` (example)
|
|
```json |
|
|
{ |
|
|
"codebase_version": "v2.1", |
|
|
"robot_type": "piper", |
|
|
"total_episodes": 100, |
|
|
"total_frames": 49570, |
|
|
"total_tasks": 1, |
|
|
"total_videos": 300, |
|
|
"total_chunks": 1, |
|
|
"chunks_size": 1000, |
|
|
"fps": 30, |
|
|
"splits": { |
|
|
"train": "0:100" |
|
|
}, |
|
|
"data_path": "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet", |
|
|
"video_path": "videos/chunk-{episode_chunk:03d}/{video_key}/episode_{episode_index:06d}.mp4", |
|
|
"features": { |
|
|
"images.rgb.head": { |
|
|
"dtype": "video", |
|
|
"shape": [ |
|
|
480, |
|
|
640, |
|
|
3 |
|
|
], |
|
|
"names": [ |
|
|
"height", |
|
|
"width", |
|
|
"channel" |
|
|
], |
|
|
"info": { |
|
|
"video.fps": 30.0, |
|
|
"video.height": 720, |
|
|
"video.width": 1280, |
|
|
"video.channels": 3, |
|
|
"video.codec": "av1", |
|
|
"video.pix_fmt": "yuv420p", |
|
|
"video.is_depth_map": false, |
|
|
"has_audio": false |
|
|
} |
|
|
}, |
|
|
"images.rgb.hand_left": { |
|
|
"dtype": "video", |
|
|
"shape": [ |
|
|
480, |
|
|
640, |
|
|
3 |
|
|
], |
|
|
"names": [ |
|
|
"height", |
|
|
"width", |
|
|
"channel" |
|
|
], |
|
|
"info": { |
|
|
"video.fps": 30.0, |
|
|
"video.height": 480, |
|
|
"video.width": 640, |
|
|
"video.channels": 3, |
|
|
"video.codec": "av1", |
|
|
"video.pix_fmt": "yuv420p", |
|
|
"video.is_depth_map": false, |
|
|
"has_audio": false |
|
|
} |
|
|
}, |
|
|
"images.rgb.hand_right": { |
|
|
"dtype": "video", |
|
|
"shape": [ |
|
|
480, |
|
|
640, |
|
|
3 |
|
|
], |
|
|
"names": [ |
|
|
"height", |
|
|
"width", |
|
|
"channel" |
|
|
], |
|
|
"info": { |
|
|
"video.fps": 30.0, |
|
|
"video.height": 480, |
|
|
"video.width": 640, |
|
|
"video.channels": 3, |
|
|
"video.codec": "av1", |
|
|
"video.pix_fmt": "yuv420p", |
|
|
"video.is_depth_map": false, |
|
|
"has_audio": false |
|
|
} |
|
|
}, |
|
|
"states.left_joint.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
6 |
|
|
], |
|
|
"names": [ |
|
|
"left_joint_0", |
|
|
"left_joint_1", |
|
|
"left_joint_2", |
|
|
"left_joint_3", |
|
|
"left_joint_4", |
|
|
"left_joint_5" |
|
|
] |
|
|
}, |
|
|
"states.left_gripper.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": [ |
|
|
"left_gripper_0" |
|
|
] |
|
|
}, |
|
|
"states.right_joint.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
6 |
|
|
], |
|
|
"names": [ |
|
|
"right_joint_0", |
|
|
"right_joint_1", |
|
|
"right_joint_2", |
|
|
"right_joint_3", |
|
|
"right_joint_4", |
|
|
"right_joint_5" |
|
|
] |
|
|
}, |
|
|
"states.right_gripper.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": [ |
|
|
"right_gripper_0" |
|
|
] |
|
|
}, |
|
|
"actions.left_joint.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
6 |
|
|
], |
|
|
"names": [ |
|
|
"left_joint_0", |
|
|
"left_joint_1", |
|
|
"left_joint_2", |
|
|
"left_joint_3", |
|
|
"left_joint_4", |
|
|
"left_joint_5" |
|
|
] |
|
|
}, |
|
|
"actions.left_gripper.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": [ |
|
|
"left_gripper_0" |
|
|
] |
|
|
}, |
|
|
"actions.right_joint.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
6 |
|
|
], |
|
|
"names": [ |
|
|
"right_joint_0", |
|
|
"right_joint_1", |
|
|
"right_joint_2", |
|
|
"right_joint_3", |
|
|
"right_joint_4", |
|
|
"right_joint_5" |
|
|
] |
|
|
}, |
|
|
"actions.right_gripper.position": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": [ |
|
|
"right_gripper_0" |
|
|
] |
|
|
}, |
|
|
"timestamp": { |
|
|
"dtype": "float32", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": null |
|
|
}, |
|
|
"frame_index": { |
|
|
"dtype": "int64", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": null |
|
|
}, |
|
|
"episode_index": { |
|
|
"dtype": "int64", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": null |
|
|
}, |
|
|
"index": { |
|
|
"dtype": "int64", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": null |
|
|
}, |
|
|
"task_index": { |
|
|
"dtype": "int64", |
|
|
"shape": [ |
|
|
1 |
|
|
], |
|
|
"names": null |
|
|
} |
|
|
} |
|
|
} |
|
|
|
|
|
``` |
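
For reference, `data_path` and `video_path` are Python-style format strings, so the files of a given episode can be located as sketched below. This is a minimal illustration; the task directory, the episode index, and the choice of `video_key` are assumptions you supply.

```python
import json
from pathlib import Path

task_root = Path("data/sim/long_horizon_tasks/lift2/sort_the_rubbish")  # example task (assumption)
info = json.loads((task_root / "meta" / "info.json").read_text())

episode_index = 0                                      # episode to look up (assumption)
episode_chunk = episode_index // info["chunks_size"]   # e.g. 1000 episodes per chunk

# Resolve the parquet file using the format string from info.json.
parquet_path = task_root / info["data_path"].format(
    episode_chunk=episode_chunk, episode_index=episode_index
)

# Resolve one camera video the same way; video_key matches a video feature name.
video_path = task_root / info["video_path"].format(
    episode_chunk=episode_chunk, episode_index=episode_index, video_key="images.rgb.head"
)
print(parquet_path)
print(video_path)
```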
|
|
|
|
|
### Key format in features
|
|
Feature keys are chosen according to each dataset's characteristics, such as the robot embodiment and whether it is single-arm or bimanual.
|
|
``` |
|
|
|-- images |
|
|
|-- rgb |
|
|
|-- head |
|
|
|-- hand_left |
|
|
|-- hand_right |
|
|
|-- states |
|
|
|-- left_joint |
|
|
|-- position |
|
|
|-- right_joint |
|
|
|-- position |
|
|
|-- left_gripper |
|
|
|-- position |
|
|
|-- right_gripper |
|
|
|-- position |
|
|
|-- actions |
|
|
|-- left_joint |
|
|
|-- position |
|
|
|-- right_joint |
|
|
|-- position |
|
|
|-- left_gripper |
|
|
|-- position |
|
|
|-- right_gripper |
|
|
|-- position |
|
|
|
|
|
``` |
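
In each episode parquet file these nested keys appear as flattened, dot-separated column names (e.g. `states.left_joint.position`). The sketch below assumes pandas with a parquet engine (e.g. pyarrow) is installed and reuses the hypothetical episode path from the examples above; treat the exact column names as an assumption and check `episode.columns` first.

```python
import pandas as pd

# Hypothetical episode path, following the folder hierarchy and info.json templates above.
episode = pd.read_parquet(
    "data/sim/long_horizon_tasks/lift2/sort_the_rubbish/data/chunk-000/episode_000000.parquet"
)

# Each row is one frame; state and action features use dot-separated keys.
print(episode.columns.tolist())
print(episode[["states.left_joint.position", "actions.left_gripper.position"]].head())
```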
|
|
|
|
|
# 📅 TODO List |
|
|
- [x] Released: 632k simulation pretraining trajectories (over 7,433 hours).
|
|
- [ ] To be released: real-world post-training data. |
|
|
|
|
|
# License and Citation |
|
|
All the data and code within this repo are under [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/). Please consider citing our project if it helps your research. |
|
|
|
|
|
```BibTeX |
|
|
@misc{contributors2025internroboticsrepo, |
|
|
title={InternData-A1}, |
|
|
author={InternData-A1 contributors}, |
|
|
  howpublished={\url{https://github.com/InternRobotics/InternManip}},
  year={2025}
|
|
} |
|
|
``` |