---
annotations_creators:
  - manual
language:
  - en
license: apache-2.0
multilinguality:
  - monolingual
pretty_name: ARM4R Dataset
size_categories:
  - 100K<n<1M
source_datasets:
  - original
task_categories:
  - robotics
task_ids:
  - grasping
---

Pre-training Auto-regressive Robotic Models with 4D Representations

by Dantong Niu*, Yuvan Sharma*, Haoru Xue, Giscard Biamby, Junyi Zhang, Ziteng Ji, Trevor Darrell†, and Roei Herzig†

*Equal contribution, †Equal advising
Berkeley AI Research, UC Berkeley
ICML 2025

Paper · Code · Models · Dataset

This repository is structured as follows:

.
├── .gitattributes
├── README.md
├── epic_clips.json  # maps episode id --> language instruction (76,014 episodes)
├── epic_tasks_final.zip  # extracted 3D point data for Epic-Kitchens (76,014 episodes)
└── real_kinova_release_data.zip  # collected data for the real-world Kinova Gen3 setup (2,550 episodes)
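As a minimal sketch of working with the episode-to-instruction mapping: the snippet below assumes `epic_clips.json` is a flat JSON object mapping episode IDs to instruction strings. The episode IDs and instructions shown are illustrative placeholders, not actual entries from the dataset.

```python
import json

# Illustrative sample mimicking the assumed schema of epic_clips.json:
# a flat mapping from episode id to language instruction.
sample = {
    "episode_0001": "pick up the cup",
    "episode_0002": "open the drawer",
}

# Write and re-read a sample file the same way you would load the real one.
with open("epic_clips_sample.json", "w") as f:
    json.dump(sample, f)

with open("epic_clips_sample.json") as f:
    clips = json.load(f)

# Look up the language instruction for a given episode id.
instruction = clips["episode_0001"]
print(instruction)  # -> pick up the cup
```

For the real file, replace the sample path with `epic_clips.json` after downloading; the lookup pattern is the same.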

Citation

If you find our work helpful, please consider citing:

@article{niu2025pre,
  title={Pre-training auto-regressive robotic models with 4d representations},
  author={Niu, Dantong and Sharma, Yuvan and Xue, Haoru and Biamby, Giscard and Zhang, Junyi and Ji, Ziteng and Darrell, Trevor and Herzig, Roei},
  journal={arXiv preprint arXiv:2502.13142},
  year={2025}
}