WorldGUI: An Interactive Benchmark for Desktop GUI Automation from Any Starting Point
What's new with WorldGUI Benchmark?
TL;DR: WorldGUI extends GUI agent evaluation from a static to a dynamic testing process, which better reflects the complex and changing nature of real GUI environments.
WorldGUI is an early effort to simulate the dynamism of real user-computer scenarios. As illustrated in the figure above, most GUI benchmarks focus only on initial and final states, measuring success rates while overlooking the changing initial conditions present in real GUI scenarios. These benchmarks often ignore situations where:
(1) The software interface is not in its default state.
(2) The agent may receive user queries at any time.
(3) Agents differ in robustness: two agents with the same low success rate (e.g., 2%) may differ in their ability to self-verify or self-correct, yet these abilities are not measured in a static setting.

Figure 1: Software taxonomy of WorldGUI and a performance comparison of GUI agents. The left shows the 5 main categories and 10 software applications in WorldGUI. The right shows that WorldGUI-Agent significantly surpasses previous SOTA GUI agents.

Figure 2: Overview of WorldGUI. The left shows that for each task, WorldGUI provides a user query, an instructional video, and pre-actions; the pre-actions lead to different initial states. The key characteristic of WorldGUI is that the same task has various initial states, simulating the real-world testing process. The right shows the software included in our benchmark and the interactions involved in testing agents in our GUI environment.

Table 1: WorldGUI is a unique benchmark that provides various initial states for each task, simulating real-world agent-computer interactions.
Data Statistics

Table 2: All tasks, task activities, and project files of the desktop applications used in WorldGUI.
Data Format
The example below shows the meta task and an augmented (aug) task for the project "adobeacrobat_00". All meta information is saved in the file worldgui_metadata.json.
[{
"Index ID": 0,
"Project ID": "adobeacrobat_00_meta",
"Task Category": "Office",
"Software Name": "Adobe Acrobat",
"User Query": "Edit PDF by removing selected text 'College of Design and Engineering'.",
"Project File Type": "project.pdf",
"project file path": "AdobeAcrobat/adobeacrobat_00/project.pdf",
"video path": "AdobeAcrobat/adobeacrobat_00/inst_video.mp4",
"Ground-Truth Plan": "Task 1: Edit a PDF\\nSubtask 1: Access the 'Edit PDF' tool located in the right-hand toolbar to enable editing features.\\nSubtask 2: Click to select the text 'College of Design and Engineering'. Use the mouse to highlight the specific text area.\\nSubtask 3: Remove the selected text by pressing the 'Delete' key on your keyboard, ensuring unwanted text is cleared.\\nSubtask 4: Click 'Close' to exit editing features.",
"Difficulty": "simple",
"Task Length": 4.0,
"Aug Type": "",
"Pre-Action": "",
"Comments": ""
},
{
"Index ID": 1,
"Project ID": "adobeacrobat_00_aug_01",
"Task Category": "Office",
"Software Name": "Adobe Acrobat",
"User Query": "Edit PDF by removing selected text 'College of Design and Engineering'.",
"Project File Type": "project.pdf",
"project file path": "AdobeAcrobat/adobeacrobat_00/project.pdf",
"video path": "AdobeAcrobat/adobeacrobat_00/inst_video.mp4",
"Ground-Truth Plan": "",
"Difficulty": "hard",
"Task Length": "",
"Aug Type": "Add-Step",
"Pre-Action": "from pyautogui import click, scroll\\nclick(540, 800)\\nscroll(-6000)",
"Comments": "Scroll the document downward to hide the target word"
},
...
]
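As a quick illustration, here is a minimal sketch (not the official evaluation harness) of how one might load the metadata and replay a task's pre-actions with pyautogui before handing the query to an agent. It assumes the metadata file sits in the working directory and that the escaped newlines in the "Pre-Action" field need to be unescaped; adjust paths and handling to your setup.

```python
import json

# A minimal sketch: load the WorldGUI metadata and replay one task's
# pre-actions so the agent starts from an altered initial state.
with open("worldgui_metadata.json", encoding="utf-8") as f:
    tasks = json.load(f)

task = tasks[1]  # e.g. the augmented task "adobeacrobat_00_aug_01"

# "Pre-Action" stores pyautogui statements separated by escaped newlines.
# Executing them moves the software away from its default state
# (here: scrolling the PDF so the target text is hidden).
pre_action = task["Pre-Action"].replace("\\n", "\n")
if pre_action.strip():
    exec(pre_action)  # runs: from pyautogui import click, scroll; click(540, 800); scroll(-6000)

# The agent then receives the user query starting from this initial state.
print(task["User Query"], task["project file path"])
```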
BibTeX
If you find WorldGUI useful, please cite using this BibTeX:
@misc{zhao2025worldguiinteractivebenchmarkdesktop,
title={WorldGUI: An Interactive Benchmark for Desktop GUI Automation from Any Starting Point},
author={Henry Hengyuan Zhao and Kaiming Yang and Wendi Yu and Difei Gao and Mike Zheng Shou},
year={2025},
eprint={2502.08047},
archivePrefix={arXiv},
primaryClass={cs.AI},
url={https://arxiv.org/abs/2502.08047},
}