---
datasets:
  - StaticEmbodiedBench
language:
  - en
tags:
  - embodied-AI
  - vlm
  - vision-language
  - multiple-choice
license: mit
pretty_name: StaticEmbodiedBench
task_categories:
  - visual-question-answering
---

## 📘 Dataset Description

StaticEmbodiedBench is a dataset for evaluating vision-language models on embodied intelligence tasks, as featured on the OpenCompass leaderboard.

It covers three key capabilities:

- **Macro Planning:** Decomposing a complex task into a sequence of simpler subtasks.
- **Micro Perception:** Performing concrete simple tasks such as spatial understanding and fine-grained perception.
- **Stage-wise Reasoning:** Deciding the next action based on the agent's current state and perceptual inputs.

Each sample is also labeled with a visual perspective:

- **First-Person View:** The visual sensor is integrated with the agent, e.g., mounted on the end-effector.
- **Third-Person View:** The visual sensor is separate from the agent, e.g., a top-down or observer view.

This release includes 200 open-source samples from the full dataset, provided for public research and benchmarking purposes.
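The capability and perspective labels make it easy to slice evaluation results. The sketch below shows one way to group samples by these labels; the field names (`capability`, `perspective`) are illustrative assumptions for this example, not the dataset's actual column names.

```python
# Illustrative samples; the field names ("capability", "perspective") are
# assumptions for this sketch, not the released schema.
samples = [
    {"capability": "Macro Planning", "perspective": "First-Person View"},
    {"capability": "Micro Perception", "perspective": "Third-Person View"},
    {"capability": "Stage-wise Reasoning", "perspective": "First-Person View"},
]

def by_capability(samples, name):
    """Select the subset of samples labeled with a given capability."""
    return [s for s in samples if s["capability"] == name]

first_person = [s for s in samples if s["perspective"] == "First-Person View"]
print(len(by_capability(samples, "Macro Planning")), len(first_person))  # 1 2
```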


## 💡 Usage

This dataset is fully supported by VLMEvalKit.

### 🔧 Evaluate with VLMEvalKit

Registered dataset names:

- `StaticEmbodiedBench`: standard evaluation
- `StaticEmbodiedBench_circular`: circular evaluation, in which each multiple-choice question is re-asked under rotations of its answer options

To run evaluation in VLMEvalKit:

```bash
python run.py --data StaticEmbodiedBench --model <your_model_name> --verbose
```

For circular evaluation, simply use:

```bash
python run.py --data StaticEmbodiedBench_circular --model <your_model_name> --verbose
```
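The idea behind circular evaluation is that a question counts as correct only if the model picks the right option under every rotation of the option order, which penalizes positional guessing. The sketch below illustrates that scoring rule; the function names are illustrative, not VLMEvalKit's actual API.

```python
# Sketch of circular multiple-choice scoring: a question is credited only if
# the model answers correctly for every cyclic rotation of its options.
# Function names here are illustrative, not VLMEvalKit's API.

def rotations(options):
    """Yield every cyclic rotation of the option list."""
    for k in range(len(options)):
        yield options[k:] + options[:k]

def circular_correct(answer, options, model_choice):
    """Return True only if model_choice picks `answer` in every rotation."""
    for opts in rotations(options):
        letters = "ABCD"[: len(opts)]
        gold = letters[opts.index(answer)]  # gold letter under this rotation
        if model_choice(opts) != gold:
            return False
    return True

# A toy "model" that always finds the option "red", wherever it appears:
model = lambda opts: "ABCD"[opts.index("red")]
print(circular_correct("red", ["red", "green", "blue", "white"], model))  # True
```

A model that always answers "A" would pass the standard single-order check here but fail the circular one, since "red" moves away from position A under rotation.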

## 📚 Citation

If you use this dataset in your research, please cite it as follows:

```bibtex
@misc{staticembodiedbench,
  title  = {StaticEmbodiedBench},
  author = {Xiao, Jiahao and Guo, Shengyu and Li, Chunyi and Yan, Bowen and Zhang, Jianbo},
  year   = {2025},
  url    = {https://huggingface.co/datasets/xiaojiahao/StaticEmbodiedBench}
}
```